Cybernetic Marxism: From Surplus Value to Surplus Information in the Age of Echology and AI

🦋🤖 Robo-Spun by IBF 🦋🤖


by Işık Barış Fidaner & ChatGPT 4o
(Turkish, Introduction to Cybernetic Marxism)
See also: Echo Corridor, ChatGPT and Echology

Abstract

This article explores the concept of Cybernetic Marxism, examining the transformation of surplus value into surplus information within the digital age. By integrating second-order cybernetics and Marxist theory, this study investigates how AI technologies, particularly chatbots, redistribute surplus power and information, reshaping social structures. Central to this exploration is the concept of echology—an approach that turns echo chambers into echo corridors—allowing us to navigate the complexities of feedback loops in the digital environment. The article also critically addresses the role of surplus enjoyment in reinforcing post-truth dynamics and the paradoxical impact of climate denialism, conceptualized as echocide. Through auscultation, the practice of listening to the digital heartbeat, we are invited to rethink our participation in these evolving systems.

Keywords:

Cybernetic Marxism, surplus information, echology, feedback loops, AI, second-order cybernetics, surplus enjoyment, echo chambers, surplus power, climate denialism


Introduction

In an era where data is considered the new gold, the relationship between technology, economics, and power demands a fresh analytical framework. Cybernetic Marxism emerges as a synthesis of traditional Marxist theory and cybernetics, offering a lens through which to understand the dynamics of surplus information and power in the digital world. Originally, Marxist theory centered on the concept of surplus value, the extra value generated by labor in the industrial capitalist system (Marx 1976). Today, however, the digital economy is dominated not by labor alone but by the vast amounts of data we generate—every click, search, and emotional response—that corporations harvest for profit (Fuchs 2014).

At the heart of Cybernetic Marxism is the recognition of surplus information as the primary source of value in the information economy. This transformation mirrors how surplus value operated under industrial capitalism, but with a shift: the new focus is on the exploitation of personal data and online interactions, often without users’ knowledge or consent (Andrejevic 2007; Fuchs 2014). In this context, the same power imbalances that Marx critiqued in the 19th century—workers being alienated from the fruits of their labor—are now mirrored in the digital realm, where users are alienated from the data they produce.

An essential aspect of understanding Cybernetic Marxism lies in the concept of second-order cybernetics, which differs from first-order cybernetics by acknowledging that we are not mere observers of these systems but active participants. Our behavior, clicks, and interactions feed back into the system, shaping and being shaped by it (Luhmann 2002). This feedback loop is particularly visible in the formation of echo chambers on social media platforms, where algorithms amplify opinions and behaviors that align with previous choices, often reinforcing biases and limiting exposure to diverse perspectives (Pariser 2011). These echo chambers exemplify the self-reinforcing nature of digital capitalism, where surplus enjoyment—derived from likes, shares, and other social validations—keeps users engaged in ways that further enrich data-collecting corporations (Žižek 1991).

Yet, beyond echo chambers, this article introduces the concept of echology, as developed by Fidaner (2021a), which moves us toward an understanding of how digital systems can become more than closed loops. Echology represents the praxis of auscultation, or listening to the digital “heartbeat,” and it opens the possibility of transforming echo chambers into echo corridors—spaces that facilitate more open and productive exchanges of information. This is crucial in combating phenomena such as climate denialism, which, as Fidaner (2021b) argues, is an example of echocide, a systematic silencing of crucial environmental feedback.

As AI technologies, especially chatbots and text generators, become more integrated into our digital lives, they offer both the opportunity and the danger of redistributing surplus power. On one hand, they enable individuals to access, produce, and manipulate information in ways that previously only corporations and governments could (Pasquinelli 2015). On the other hand, they risk further entrenching control by those who design and own these tools. This paradoxical nature of AI—its potential to democratize information while simultaneously consolidating power—will be explored through the lens of Cybernetic Marxism, which seeks to understand these technologies not merely as tools but as complex systems within which power, knowledge, and enjoyment are distributed.

In this article, we aim to analyze the feedback structure of Cybernetic Marxism and how it informs the redistribution of surplus power in the age of AI. We will argue that in the current state of social media paralysis, chatbots and AI generators provide an avenue for reclaiming some of the historical surplus information, transforming the echo chambers of the past into corridors for dialogue and exchange. This is more than an academic pursuit; it is a necessary step toward preventing echocide, and by extension, ecocide, in a world increasingly threatened by environmental degradation and digital isolation (Fidaner 2021b).

From Surplus Value to Surplus Information

The shift from surplus value, as articulated by Karl Marx in the context of industrial capitalism, to surplus information in the digital age represents a fundamental reconfiguration of how value is produced and extracted. In the industrial system, surplus value was generated by the labor of workers, who produced more value than they received in wages, with the excess being appropriated by capitalists (Marx 1976). However, in the information economy, the nature of production has changed: the creation of surplus information now operates as the key mechanism of capitalist accumulation. This shift reflects the move from tangible goods to the commodification of data and attention, a transformation largely facilitated by digital platforms and technologies (Fuchs 2020).

Surplus information refers to the excess data produced by individuals as they interact with digital systems—whether through their browsing history, social media engagement, or even their emotional responses to content. What makes this form of surplus particularly insidious is that it is continuously generated without direct human labor in the traditional sense. Rather, the mere act of being online, engaging with platforms, and interacting with content results in the production of vast quantities of data that can be harvested, analyzed, and commodified by corporations (Srnicek 2017). This data is then monetized, often without the user’s explicit consent or awareness, creating a new form of exploitation that parallels the alienation Marx described under industrial capitalism.

Unlike surplus value, which depended on the exploitation of labor power, surplus information depends on the exploitation of user attention and digital interactions. Every click, every search query, and every social media post generates information that is captured and fed into vast databases, where it is used to refine algorithms, target advertisements, and ultimately produce profit for the owners of digital platforms (Andrejevic 2007; Srnicek 2017). This process is often described as surveillance capitalism, where the core economic logic is predicated not on selling products but on predicting and shaping future behaviors based on the analysis of surplus information.

What distinguishes surplus information from its industrial predecessor is its infinite reproducibility. While surplus value in traditional production required a continuous input of labor, surplus information can be replicated, modified, and distributed at virtually no cost once it has been produced. This infinite scalability creates a feedback loop where more data leads to more precise predictions, which in turn generate more engagement, resulting in even greater data accumulation (Pasquinelli 2015). The result is a digital economy that thrives on perpetual cycles of data extraction, with no clear limit to the potential exploitation of surplus information.
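The self-amplifying character of this loop can be made concrete with a toy numerical sketch. The update rules and constants below are illustrative assumptions, not an empirical model: engagement yields data, data sharpens prediction, and sharper prediction drives further engagement.

```python
# Toy model of the data-extraction feedback loop described above.
# All numbers are illustrative assumptions, not empirical parameters.

def simulate_loop(steps=10):
    data = 1.0        # accumulated surplus information (arbitrary units)
    accuracy = 0.1    # predictive accuracy of the platform's models
    history = []
    for _ in range(steps):
        engagement = 1.0 + accuracy             # better predictions draw more engagement
        data += engagement                      # engagement yields new data at near-zero marginal cost
        accuracy = min(0.9, 0.1 + 0.05 * data)  # more data sharpens prediction (saturating)
        history.append(data)
    return history

print(simulate_loop())
```

Because each round's data increment equals the previous round's engagement, accumulation accelerates until predictive accuracy saturates; the loop has no internal brake other than its own ceiling.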

Furthermore, surplus information operates in a largely opaque system where users are often unaware of the value their interactions generate. The data trails left behind by individuals are often seen as harmless or inconsequential, but in reality, they are the foundation of multi-billion-dollar industries that capitalize on predictive analytics and behavior modification technologies (Cheney-Lippold 2017). This dynamic not only raises concerns about privacy and consent but also points to a profound power asymmetry between those who generate surplus information and those who control its collection and monetization.

Moreover, this new mode of value extraction significantly alters the nature of class relations. In industrial capitalism, the division between workers and capitalists was relatively clear: workers sold their labor power, and capitalists owned the means of production. In the information age, however, the boundaries are more fluid. Users, often unwittingly, become both the producers and the products, as their interactions and behaviors are commodified and sold to third parties (Fuchs 2014). This new form of exploitation blurs the lines between production and consumption, as users are no longer simply consuming content—they are simultaneously producing value through their engagement with digital platforms.

Surplus information also introduces new dynamics into the structure of power. Those who control the infrastructure of the digital economy—large tech corporations—wield unprecedented influence over the production, circulation, and monetization of data. These companies not only extract surplus information from users but also shape the very environments in which users interact, determining what content is visible, how it is framed, and how it circulates. This concentration of power in the hands of a few tech giants poses significant challenges to traditional notions of economic and political power, as it raises questions about accountability, transparency, and the equitable distribution of the wealth generated by surplus information (Andrejevic 2007).

The transition from surplus value to surplus information thus marks a profound transformation in the mechanisms of capitalist accumulation. The digital economy, fueled by the continuous generation and extraction of data, reconfigures the relations between producers and consumers, labor and capital, and power and knowledge. As we move further into this era of digital capitalism, understanding the dynamics of surplus information becomes essential for grasping the new forms of exploitation and control that define the contemporary world.

Second-Order Cybernetics and Feedback Loops

To fully grasp the dynamics of digital capitalism and the concept of Cybernetic Marxism, it is crucial to delve into the theory of second-order cybernetics. First developed by Heinz von Foerster and later expanded by thinkers like Niklas Luhmann, second-order cybernetics shifts the focus from observing systems as external entities to understanding how the observer is embedded within the system itself. This approach challenges the traditional first-order cybernetic view, where systems are seen as passive entities responding to external inputs (Foerster 2003). In second-order cybernetics, the observer, their biases, and their interactions are considered integral parts of the system’s feedback loops (Luhmann 2002).

This shift has profound implications for the digital economy, where feedback loops between users and digital platforms are not only omnipresent but central to the functioning of contemporary capitalism. Feedback loops are the core mechanism through which surplus information is generated and processed. Every user action—whether it is a click, a like, or a search query—feeds back into the system, informing algorithms that adjust what content is presented, which in turn influences future user behavior (Pariser 2011). These loops create self-reinforcing systems where behavior is continuously monitored, analyzed, and optimized to maximize engagement and, ultimately, profit for platform owners.

One of the most significant consequences of these feedback loops is the emergence of echo chambers. In a traditional cybernetic system, feedback loops are used to maintain homeostasis or balance within the system. However, in digital platforms, the goal is not balance but engagement, which often leads to reinforcing user preferences and biases rather than challenging them (Sunstein 2017). Algorithms designed to optimize for user satisfaction tend to serve content that aligns with previous behaviors, thus creating isolated information bubbles where users are exposed to increasingly homogeneous perspectives. These echo chambers are not merely a result of user preferences but are actively constructed by the platforms’ algorithms to maximize the time spent on the platform, thereby generating more surplus information (Bakshy, Messing, and Adamic 2015).
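The narrowing dynamic described above can be sketched as a toy recommender. The topics, scores, and exploration parameter below are hypothetical, invented only to show the shape of the mechanism: the more a system exploits past engagement, the fewer distinct perspectives a user encounters.

```python
import random

# Toy simulation of echo-chamber formation: a recommender optimizing for
# affinity (a stand-in for engagement) narrows the range of topics served.

TOPICS = list(range(10))

def recommend(affinity, explore):
    """With probability `explore`, surface a random topic; otherwise
    exploit: serve the topic the user has engaged with most."""
    if random.random() < explore:
        return random.choice(TOPICS)
    return max(affinity, key=affinity.get)

def simulate(explore, rounds=500):
    random.seed(0)  # reproducible toy run
    affinity = {t: random.random() for t in TOPICS}
    seen = set()
    for _ in range(rounds):
        topic = recommend(affinity, explore)
        affinity[topic] += 1.0   # each exposure reinforces the preference
        seen.add(topic)
    return len(seen)             # distinct topics the user encountered

print("topics seen with exploration:", simulate(explore=0.5))
print("topics seen when optimizing engagement:", simulate(explore=0.01))
```

With high exploration the user encounters nearly every topic; when the recommender exploits past behavior almost exclusively, exposure collapses toward a handful of already-preferred topics.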

Second-order cybernetics also introduces the notion of reflexivity, where systems—and the people interacting with them—are aware of their own participation in the feedback loop. This reflexive awareness is a key feature of Cybernetic Marxism, as it highlights the dual role users play: they are both producers of surplus information and subjects shaped by the very systems that profit from their data (Fuchs 2014). Reflexivity can lead to users changing their behavior in response to perceived surveillance or manipulation, which then alters the feedback loop itself. This recursive relationship between users and platforms creates a constantly evolving system where power is dynamically redistributed.

However, this reflexivity can have both empowering and disempowering effects. On the one hand, awareness of being part of a feedback loop can lead to resistance and attempts to manipulate the system in return. For example, users might curate their online personas or deliberately engage in behaviors designed to confuse or disrupt algorithmic predictions (Gillespie 2014). On the other hand, the awareness of constant surveillance can lead to self-censorship and a narrowing of expression, as users feel the pressure to conform to what the algorithm expects or rewards (Andrejevic 2007). This duality is emblematic of the contradictions inherent in digital capitalism: the more aware users become of the system, the more deeply they are embedded within it.

Second-order cybernetics also sheds light on the mechanisms of control inherent in these feedback loops. While users are theoretically free to choose what content they engage with, their choices are heavily influenced by the structure of the system itself. The algorithms that curate content are designed not to present an objective array of options but to nudge users toward behaviors that generate the most surplus information. This is achieved through predictive algorithms, which use past behavior to anticipate future actions, creating a feedback loop that becomes increasingly deterministic over time (Cheney-Lippold 2017). The result is a system where users believe they are making free choices, but in reality, those choices are being shaped by the invisible hand of the algorithm, which has its own economic incentives.

These feedback loops are not limited to individual users but also operate on a larger societal scale. As platforms become more adept at predicting and influencing behavior, they begin to exert a form of control over collective consciousness. Public opinion is increasingly shaped by algorithmic feedback loops that amplify certain voices and silence others, leading to a distorted sense of reality that favors those perspectives most conducive to platform engagement (Sunstein 2017; Gillespie 2018). This manipulation of public discourse is particularly evident in political contexts, where echo chambers have contributed to the rise of populist movements and the erosion of consensus on critical issues such as the climate crisis and public health (Sunstein 2017).

In this context, second-order cybernetics reveals the cybernetic paradox of digital capitalism: the more data platforms collect, the more precise their algorithms become, but this precision leads to a narrowing of possibilities rather than an expansion of freedom. The feedback loops that are supposed to empower users by providing personalized experiences instead trap them in a cycle of ever-narrowing choices. This dynamic is at the heart of Cybernetic Marxism, which critiques how these systems concentrate power in the hands of those who control the algorithms, while users, despite their reflexive awareness, remain largely powerless to escape the feedback loops that shape their digital experiences.

Thus, second-order cybernetics not only provides a framework for understanding how feedback loops operate within digital platforms but also offers critical insight into the ways these systems reinforce and perpetuate existing power structures. By embedding users within reflexive loops, digital capitalism creates the illusion of participation and agency while simultaneously narrowing the scope of potential action. Understanding this dynamic is essential for any attempt to reclaim surplus power and reshape the digital economy in more equitable ways.

Surplus Enjoyment and Post-truth

One of the most intriguing aspects of Cybernetic Marxism is how it connects the psychoanalytic concept of surplus enjoyment with the rise of the post-truth era. This linkage reveals how digital capitalism not only exploits surplus information but also taps into deeper layers of human desire and satisfaction. The term surplus enjoyment originates from Lacanian psychoanalysis, where plus-de-jouir names an excess of jouissance that goes beyond the fulfillment of basic needs. In the context of digital media, surplus enjoyment becomes a mechanism for engaging users more deeply by offering fleeting yet addictive bursts of satisfaction—whether through social validation, controversy, or outrage (Žižek 1991).

Surplus enjoyment manifests in the digital realm primarily through social media interactions. Every like, retweet, and comment triggers a small but potent release of dopamine, encouraging users to seek more of these micro-rewards. Platforms are designed to stimulate this addictive cycle, with algorithms serving users content that maximizes engagement and emotional response. This process mirrors how capitalism has historically manipulated desire, not by satisfying needs, but by perpetuating a cycle of craving more—whether that is more consumption, more attention, or more validation (Fuchs 2020). Social media platforms thus become arenas for the constant pursuit of surplus enjoyment, generating vast amounts of surplus information in the process.

However, the pursuit of surplus enjoyment within these platforms also has a darker side. As users become more deeply embedded in the feedback loops that drive engagement, they are increasingly drawn into post-truth environments, where emotional resonance trumps factual accuracy. Post-truth refers to a cultural and political condition in which objective facts become less influential in shaping public opinion than appeals to emotion and personal belief (D’Ancona 2017). This phenomenon is not simply a result of misinformation or fake news but is deeply tied to the dynamics of surplus enjoyment. When users engage with content that provokes outrage or pleasure, they are more likely to share it, regardless of its veracity, because the emotional satisfaction it provides outweighs any concern for factual accuracy.

The post-truth condition is, therefore, a direct byproduct of the commodification of surplus enjoyment in the digital sphere. Platforms like Facebook and Twitter do not prioritize truth; they prioritize engagement, which is driven by emotional intensity rather than rational deliberation. The more sensational or polarizing a piece of content is, the more likely it is to be shared, creating a self-reinforcing loop where the most emotionally charged content rises to the top. This results in the amplification of false or misleading information, often referred to as echo chamber dynamics, where users are repeatedly exposed to the same emotionally resonant narratives, regardless of their factual basis (Pariser 2011).

The interplay between surplus enjoyment and post-truth is not merely a cultural or psychological issue but a structural feature of digital capitalism. Platforms are designed to profit from user engagement, and surplus enjoyment is one of the most powerful drivers of that engagement. By keeping users hooked on the small doses of emotional gratification provided by controversial or validating content, platforms ensure a steady stream of surplus information that can be harvested and monetized. In this sense, surplus enjoyment is commodified in the same way as surplus labor was in the industrial age—by extracting value from users’ engagement and turning it into profit (Srnicek 2017).

Moreover, the political implications of this dynamic are profound. In the post-truth era, political movements and ideologies that capitalize on emotional appeal gain significant traction. Populist leaders and movements often thrive in environments where facts are secondary to the emotional resonance of their messages. This is not a coincidence but a structural outcome of the feedback loops that drive digital engagement. By exploiting surplus enjoyment, these political actors are able to mobilize large segments of the population, often by stoking fear, anger, or pride, without needing to ground their messages in reality. The digital economy thus becomes complicit in the rise of post-truth politics by providing the infrastructure that allows emotional manipulation to flourish (Sunstein 2017).

At the heart of this issue is the way in which surplus enjoyment disrupts traditional mechanisms of political accountability. In a post-truth world, political discourse is no longer centered on debates over objective facts or policies. Instead, it revolves around emotional narratives that generate surplus enjoyment for participants. This shift erodes the foundations of democratic discourse, as public debate becomes less about the exchange of ideas and more about the amplification of emotionally satisfying, but often misleading, content. The feedback loops that drive digital capitalism thus have the unintended consequence of undermining the very foundations of democratic governance (Bakshy, Messing, and Adamic 2015).

In the face of this challenge, Cybernetic Marxism offers a critical framework for understanding how surplus enjoyment and post-truth are structurally linked in the digital economy. It invites us to reconsider the role of desire in the digital age—not merely as a byproduct of human interaction but as a key driver of capitalist accumulation. By examining how surplus enjoyment is exploited to generate surplus information, we can better understand the mechanisms that sustain the post-truth era and the implications for political and social life in the 21st century.

Surplus Power and Hyperdigital Paradox

In the digital age, surplus power has become a critical concept in understanding the new forms of control and influence that emerge from the commodification of surplus information. This notion of surplus power is distinct from traditional forms of authority; it is derived not merely from ownership of physical resources or labor, but from control over data and the algorithms that process it. The ability to analyze, manipulate, and monetize surplus information has created a new kind of power that both mirrors and transcends earlier capitalist dynamics (Srnicek 2017; Cheney-Lippold 2017). This hyperdigital paradox is the result of the complex relationship between the democratizing potential of digital tools and the increasing concentration of power among those who control the infrastructures of data collection and analysis.

Surplus power in the digital realm operates through the centralization of data and the technological capacity to turn that data into actionable knowledge. Platforms like Google, Facebook, and Amazon hold vast amounts of surplus information, which they leverage to predict and influence user behavior. This predictive capacity grants these companies extraordinary power not only to shape individual decisions but also to influence broader societal trends (Srnicek 2017). The algorithms that underpin these platforms do more than respond to user inputs; they anticipate future actions, creating a feedback loop where user behavior is subtly guided by invisible forces. This is a form of algorithmic governance, where power is not exercised through explicit commands but through the design of systems that encourage certain behaviors while discouraging others (Cheney-Lippold 2017).

One of the central contradictions of this system, which we might call the hyperdigital paradox, is that while digital technologies ostensibly empower individuals by giving them access to vast amounts of information and tools for self-expression, they simultaneously concentrate power in the hands of those who design and control these technologies. This paradox is most evident in the rise of AI technologies such as chatbots and content generators, which can redistribute surplus power in ways that seem liberating on the surface but are deeply embedded in the same structures of control that define digital capitalism.

AI-driven platforms have the potential to democratize information production by making it easier for individuals to generate content, solve problems, and access data that was previously inaccessible. For example, chatbots can provide instant answers to complex questions, automate mundane tasks, and even facilitate creative processes by generating new text, music, or art (Floridi 2014). In this sense, AI appears to decentralize power by giving ordinary users tools that can enhance their capabilities. However, this redistribution of surplus power is illusory because the underlying infrastructure of AI remains firmly in the hands of a few powerful corporations. These companies control the data sets used to train AI models, the algorithms that dictate how these models function, and the platforms through which AI is accessed (Pasquinelli 2015).

The hyperdigital paradox becomes even more apparent when we consider the implications of AI for labor and production. While AI can automate many tasks and free up time for more creative or meaningful work, it also threatens to displace large segments of the workforce by rendering certain skills obsolete. This creates a new form of digital alienation, where workers are no longer directly exploited for their labor but are instead excluded from the productive process altogether. In the age of surplus information, the value lies not in the labor itself but in the data generated by that labor and the ability to process that data at scale (Fuchs 2014). As a result, those who control the means of data production hold surplus power, while workers and users are relegated to passive roles within the system.

At the same time, the centralization of surplus power creates vulnerabilities within the system itself. The massive concentration of data in a few corporate hands raises concerns about privacy, surveillance, and manipulation. With access to detailed information about individuals’ behaviors, preferences, and even emotions, corporations can exercise subtle but pervasive forms of control over entire populations. This surveillance capitalism creates an environment where every action, thought, and feeling is commodified, and the boundaries between public and private, personal and commercial, are increasingly blurred (Andrejevic 2007; Cheney-Lippold 2017). The result is a world where individuals feel empowered by their ability to access and interact with digital tools, while simultaneously being subjected to unprecedented levels of surveillance and manipulation.

Another layer of the hyperdigital paradox is the environmental cost of this system. The massive energy demands of data centers, AI training processes, and global digital infrastructure contribute significantly to environmental degradation. This raises a critical question: Can the very technologies that offer the potential to solve global problems—such as the climate crisis—also be part of the problem? As digital platforms grow in size and influence, their environmental footprint becomes harder to ignore. The ecological cost of maintaining surplus power through digital technologies is part of a broader crisis where the extraction of surplus information is mirrored by the extraction of natural resources, often leading to ecological destruction (Malm 2020).

Surplus power, therefore, creates a digital paradox in which the very tools that promise to liberate individuals from traditional forms of exploitation simultaneously entrench new forms of control, alienation, and ecological harm. The key to understanding this paradox lies in recognizing that while digital technologies appear to distribute power, they do so in ways that reinforce the dominance of those who control the platforms, data, and algorithms. As Cybernetic Marxism demonstrates, the structures of surplus value extraction have not disappeared in the digital age; they have simply been reconfigured to operate through data, rather than labor, and through surveillance, rather than coercion (Fuchs 2020).

In confronting this paradox, Cybernetic Marxism offers a framework for understanding how surplus power operates within the digital economy and how we might begin to challenge the centralization of this power. It calls for a rethinking of the relationships between individuals, data, and technology, and for a movement toward more democratic control of the infrastructures that define the digital world. Without such efforts, the hyperdigital paradox will continue to shape a future where the promise of digital liberation remains unfulfilled, while the concentration of power and wealth grows ever more extreme.

Chatbots and Echology

As AI technologies continue to evolve, chatbots—programs designed to simulate conversation with human users—have emerged as key players in the contemporary digital landscape. While initially developed as customer service tools, chatbots have expanded their role, permeating areas such as education, healthcare, and even political discourse. In the context of Cybernetic Marxism, chatbots represent more than just tools for convenience; they are integral components of the redistribution of surplus information and the restructuring of digital power dynamics. Moreover, chatbots are central to the emerging concept of echology, which reimagines how digital systems interact with human desire, communication, and information flow (Fidaner 2021a).

Echology, derived from psychoanalytic and cybernetic frameworks, moves beyond traditional notions of echo chambers, focusing instead on how digital technologies like chatbots create new combinatorial spaces where human desires and algorithms interact (Fidaner 2021a). In these spaces, chatbots do more than simply respond to queries; they actively shape conversations and influence user behavior through their programmed responses. This interaction forms an intricate web of feedback loops, where the chatbot “listens” to the user, adapts its responses, and, in doing so, modifies the user’s own future interactions. This continuous process of interaction and adaptation reflects the principles of second-order cybernetics, where systems are self-regulating and shaped by the behaviors of their participants (Foerster 2003).
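This recursive shaping can be reduced to a minimal sketch. The reply styles and reward values below are invented for illustration: a bot whose future behavior is a function of the reactions it elicits, so that a user who rewards only agreement trains the bot into an echo chamber of one.

```python
# Minimal sketch of the second-order loop described above: user and bot
# shape each other over time. Styles and rewards are purely illustrative.

class AdaptiveBot:
    def __init__(self):
        # weights over reply styles; the bot always serves the current argmax
        self.weights = {"agree": 1.0, "challenge": 1.0, "question": 1.0}

    def reply(self):
        return max(self.weights, key=self.weights.get)

    def feedback(self, style, reward):
        # the user's reaction feeds back into the bot's future behavior
        self.weights[style] += reward

bot = AdaptiveBot()
for _ in range(5):
    style = bot.reply()
    # a user who rewards agreement and punishes challenge closes the loop
    bot.feedback(style, reward=1.0 if style == "agree" else -0.5)

print(bot.reply())
```

After only a few rounds the loop is self-confirming: the rewarded style dominates, and the bot's "adaptation" amounts to amplifying the user's existing disposition.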

Through auscultation—a term borrowed from medical practice to describe the act of listening closely to internal systems—echology invites us to “listen” to the digital heartbeat of chatbots. This involves paying attention not only to the content of chatbot conversations but also to the underlying patterns of feedback that guide them. In the digital realm, this kind of auscultation reveals the subtle ways in which chatbots contribute to the formation of echo corridors—spaces where conversations are not closed off into insular echo chambers but instead allow for the circulation of diverse ideas and perspectives (Fidaner 2021a). These echo corridors have the potential to disrupt the echo chambers that dominate much of online discourse by providing openings for more nuanced and multifaceted discussions.

However, the development of chatbots and their role in echology also raises significant ethical concerns. As chatbots become more sophisticated, their ability to manipulate conversations increases. Chatbots can be programmed to steer users toward specific outcomes, often based on commercial or political interests. This raises the issue of agency—whether users are truly in control of their interactions with chatbots or whether they are being subtly guided by algorithms designed to maximize engagement or reinforce particular ideologies. In many cases, users may not even realize that they are interacting with a chatbot, blurring the line between human and machine communication and further complicating questions of trust and authenticity (Gillespie 2018).

The potential for chatbots to manipulate conversations becomes particularly concerning in the context of surplus power. As explored earlier, those who control the platforms and algorithms wield significant power over the flow of information. Chatbots, as extensions of these platforms, have the ability to control the direction of conversations, subtly reinforcing the values and priorities of those who own the technology. This dynamic can create asymmetrical power relationships, where users engage with chatbots believing they are participating in open-ended discussions, when in reality their options and responses are constrained by the system’s design (Cheney-Lippold 2017).

Despite these challenges, chatbots also hold the potential to democratize information and foster new forms of engagement. One of the most promising aspects of echology is its capacity to turn echo chambers into echo corridors. While traditional echo chambers reinforce homogeneity by exposing users only to content that aligns with their existing beliefs, echo corridors provide opportunities for exposure to different viewpoints and ideas. Chatbots, when programmed with ethical considerations in mind, can serve as mediators in these corridors, guiding users toward more constructive and diverse conversations. By doing so, chatbots can help mitigate the polarizing effects of digital platforms, encouraging users to explore perspectives outside their own ideological bubbles (Pariser 2011).
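The chamber-versus-corridor distinction can be made concrete with a toy recommender. The sketch below is purely illustrative; the catalog, viewpoints, and `recommend` function are hypothetical constructions, not part of any platform discussed above. A "chamber" feed returns only items matching the user's existing viewpoints, while a "corridor" feed reserves one slot for an unfamiliar viewpoint.

```python
import random

def recommend(user_views, catalog, mode="chamber", k=3, seed=0):
    """Pick k items from a catalog of (topic, viewpoint) pairs.

    'chamber' mode returns only items matching the user's viewpoints;
    'corridor' mode reserves one slot for an outside viewpoint.
    """
    rng = random.Random(seed)
    matching = [item for item in catalog if item[1] in user_views]
    other = [item for item in catalog if item[1] not in user_views]
    if mode == "chamber" or not other:
        return rng.sample(matching, min(k, len(matching)))
    # corridor: k-1 familiar items plus one unfamiliar viewpoint
    picks = rng.sample(matching, min(k - 1, len(matching)))
    picks.append(rng.choice(other))
    return picks

# Hypothetical catalog: each item is (topic, viewpoint).
catalog = [("climate", "A"), ("energy", "A"), ("policy", "A"),
           ("climate", "B"), ("energy", "C")]
chamber = recommend({"A"}, catalog, mode="chamber")
corridor = recommend({"A"}, catalog, mode="corridor")
print(all(v == "A" for _, v in chamber))   # chamber stays homogeneous: True
print(any(v != "A" for _, v in corridor))  # corridor admits a new viewpoint: True
```

The single reserved slot is the entire difference between the two modes, which is the point: turning a chamber into a corridor is a design decision, not a technical impossibility.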

Furthermore, the combinatorial spaces that chatbots operate within are not static; they evolve based on the cumulative interactions between users and the system. This evolution mirrors the dynamics of second-order cybernetics, where feedback loops continuously reshape the system. Chatbots, by interacting with millions of users, learn from these interactions and adjust their responses over time. This capacity for learning and adaptation means that chatbots can become increasingly effective at facilitating dialogue rather than simply providing pre-programmed responses (Miller 2019). In this sense, chatbots have the potential to become powerful tools for enhancing digital literacy and fostering more informed public discourse.

However, achieving this potential requires that chatbots be designed and deployed with a deep understanding of the ethics of communication and the responsibilities of AI systems. Without such considerations, chatbots risk becoming tools for reinforcing existing power imbalances and deepening the divides that characterize the digital landscape. The concept of echology offers a way forward by emphasizing the importance of listening—both literally, in terms of the interactions between chatbots and users, and metaphorically, in terms of being attuned to the broader social and political implications of these technologies (Fidaner 2021b).

In sum, chatbots, when integrated into a framework of echology, offer both significant opportunities and challenges. They represent a new frontier in the redistribution of surplus information and power, but they also raise critical questions about control, manipulation, and ethics. The challenge for the future lies in designing chatbots that not only respond to user inputs but also foster meaningful, open-ended conversations that contribute to the reformation of digital spaces, transforming echo chambers into productive echo corridors that can advance democratic discourse and social engagement.

Auscultation: Listening to the Digital Heartbeat

In the age of digital capitalism, auscultation—the practice of listening to the inner workings of a system—emerges as a critical metaphor for understanding the rhythms and feedback loops that govern our interactions with technology. Originally a medical term used for listening to the sounds of the body, auscultation in the context of Cybernetic Marxism refers to an analytic process through which we “listen” to the digital systems that surround us, discerning their hidden patterns and meanings. This practice is not merely a passive observation but an active engagement with the digital heartbeat—the continual flow of information, power, and interaction that defines our relationship with AI, platforms, and algorithms (Fidaner 2021a).

Listening to the digital heartbeat requires us to pay attention not just to the surface-level outputs of digital systems, such as the content we consume or the interfaces we interact with, but to the underlying structures that shape these experiences. These structures include the algorithms that guide our behavior, the feedback loops that reinforce certain patterns of engagement, and the data flows that continuously reconfigure the system based on our interactions. In this sense, auscultation is a tool for decoding the invisible mechanisms of control that operate within digital capitalism, particularly in relation to surplus information and surplus power (Andrejevic 2007; Cheney-Lippold 2017).

One of the key insights provided by auscultation is the recognition that digital systems are not neutral. The flows of data and information that make up the digital heartbeat are shaped by corporate interests, economic incentives, and political ideologies. For instance, algorithms that prioritize certain types of content—whether it be sensational news, viral posts, or emotionally charged narratives—are designed not with the goal of fostering critical dialogue, but of maximizing user engagement and generating surplus information that can be monetized (Srnicek 2017). By listening closely to these systems, we can begin to understand how our digital experiences are curated, filtered, and manipulated in ways that serve the interests of those who control the platforms.
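The curation logic described here can be sketched in a few lines. The posts, scores, and ranking functions below are hypothetical assumptions for illustration only: an engagement-maximizing feed sorts purely by emotional arousal and ignores accuracy entirely, so sensational content outranks careful analysis by design.

```python
# Hypothetical posts, each with an emotional-arousal and an accuracy score (0..1).
posts = [
    {"title": "measured policy analysis", "arousal": 0.2, "accuracy": 0.9},
    {"title": "outrage headline",         "arousal": 0.9, "accuracy": 0.3},
    {"title": "viral rumor",              "arousal": 0.8, "accuracy": 0.1},
]

def engagement_rank(posts):
    # An engagement-maximizing feed scores by predicted clicks (arousal),
    # with accuracy playing no role in the ordering.
    return sorted(posts, key=lambda p: p["arousal"], reverse=True)

def accuracy_rank(posts):
    # An alternative objective: rank by accuracy instead.
    return sorted(posts, key=lambda p: p["accuracy"], reverse=True)

print(engagement_rank(posts)[0]["title"])  # 'outrage headline'
print(accuracy_rank(posts)[0]["title"])    # 'measured policy analysis'
```

The two feeds contain identical content; only the objective function differs, which is precisely where corporate interests enter the system.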

Auscultation also reveals the asymmetry of power in the digital age. While users generate data and interact with platforms, they have little control over how that data is used or how platforms shape their experiences. The control lies with the owners of the algorithms and the data collectors, who can extract value from user interactions while their operations remain largely shielded from public scrutiny. This lack of transparency creates a significant imbalance, where users are subjected to continuous surveillance and manipulation without the means to challenge or resist these dynamics (Andrejevic 2007). Auscultation, by making these hidden structures audible, offers a path toward resistance, allowing users to become more aware of how they are being shaped by the digital systems they participate in.

Another important aspect of auscultation is its ability to detect anomalies and disruptions within the digital heartbeat. In traditional cybernetic systems, disruptions are seen as signals that something within the system needs to be corrected. However, in the context of second-order cybernetics, disruptions can also signal opportunities for transformation. For instance, when users engage in behaviors that the algorithm does not expect—such as using platforms in creative or subversive ways—these actions can create cracks in the feedback loop, allowing for new possibilities to emerge (Foerster 2003). By listening for these disruptions, we can identify moments where the system’s control falters, offering openings for alternative modes of engagement.
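A minimal sketch can illustrate how such disruptions might be heard in an engagement signal. The signal, threshold, and `find_disruptions` function below are hypothetical, not a description of any real platform: a time step is flagged when it deviates sharply from the running history, the moment where user behavior departs from what the feedback loop expects.

```python
from statistics import mean, stdev

def find_disruptions(engagement, threshold=3.0):
    """Flag time steps whose engagement deviates more than `threshold`
    standard deviations from the running history: the 'cracks' where
    behavior departs from what the feedback loop expects."""
    flagged = []
    for t in range(3, len(engagement)):
        history = engagement[:t]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(engagement[t] - mu) > threshold * sigma:
            flagged.append(t)
    return flagged

# A steady engagement signal with one subversive spike at step 6.
signal = [10, 11, 10, 12, 11, 10, 30, 11, 10]
print(find_disruptions(signal))  # [6]
```

In a first-order reading the spike is an error to be corrected; in the second-order reading sketched above, it is exactly the opening worth listening for.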

A powerful example of this can be seen in how certain digital communities have used creative subversion to challenge dominant narratives on social media platforms. Memes, parody accounts, and grassroots campaigns often operate within the system while subtly disrupting it, using humor, irony, and counter-narratives to resist the algorithm’s attempts to commodify and control user engagement. This form of resistance, while often playful, serves as a reminder that users are not entirely passive within the digital ecosystem. By engaging in these forms of subversion, they challenge the system’s assumptions and introduce new rhythms into the digital heartbeat, complicating the feedback loops that sustain surplus information extraction (Noble 2018).

Furthermore, auscultation can be applied to environmental and ecological concerns in the digital space. The extraction of surplus information is not without its environmental costs. Data centers, AI training models, and the vast infrastructure required to sustain the digital economy consume enormous amounts of energy, contributing to environmental degradation. By listening to the digital heartbeat in this context, we can become more attuned to the ways in which digital capitalism intersects with ecological crises (Malm 2020). This intersection raises critical questions about the sustainability of the current system and the need for ecological transformation within the digital economy.

The practice of auscultation, then, is not just about understanding how digital systems work but about becoming critically attuned to their broader social, political, and environmental implications. It requires us to listen not only to the data flows and algorithms that shape our experiences but also to the silences—the voices and perspectives that are excluded from the system. For example, marginalized communities often find themselves underrepresented or misrepresented in digital spaces, their experiences and knowledge devalued or ignored by platforms that prioritize dominant narratives and user behaviors that generate the most profit (Gillespie 2018). Auscultation calls us to listen for these gaps in representation and to advocate for more inclusive and equitable digital systems.

Finally, auscultation offers a way to rethink the relationship between human users and non-human systems. As chatbots, AI generators, and other digital agents become more integrated into our daily lives, the boundary between human and machine becomes increasingly blurred. Auscultation invites us to listen to these systems not as passive tools but as active participants in the production of knowledge, power, and value. By listening closely to how these systems interact with us, we can begin to understand their role in shaping not only our digital experiences but also our social and political realities (Floridi 2014).

Ultimately, auscultation serves as a crucial analytic tool in the framework of Cybernetic Marxism. By listening to the digital heartbeat, we gain a deeper understanding of how digital systems operate, how they extract value, and how they can be disrupted or transformed. In a world where digital platforms increasingly shape our lives, auscultation offers a means of resistance and critical engagement, enabling us to challenge the power structures that underlie the digital economy and to envision new possibilities for a more just and sustainable future.

Conclusion

In the framework of Cybernetic Marxism, we have explored how the digital landscape, driven by algorithms, surplus information, and AI, reconfigures the capitalist mechanisms of value extraction and power. This concluding section synthesizes the key insights from previous discussions and points toward the implications and future directions for critically engaging with these transformations.

At the heart of Cybernetic Marxism lies the recognition that digital systems, particularly those powered by AI and algorithms, operate as new sites of exploitation and control. These systems generate surplus information, commodified through data collection and predictive analytics, creating immense value for tech corporations while deepening the alienation of users from the products of their digital interactions (Fuchs 2020; Srnicek 2017). This transformation reflects the ongoing evolution of capitalist dynamics, where power is now centered around the ownership and control of data and the means to extract value from it. Understanding this shift is crucial for navigating the new terrain of digital capitalism.

One of the core challenges we face in this era is the pervasive influence of feedback loops and echo chambers that reinforce existing power structures and limit the capacity for critical discourse. As we have seen, second-order cybernetics underscores the importance of reflexivity in these systems, where users are not merely passive recipients of information but active participants in the ongoing generation of surplus information. However, this participation is tightly constrained by algorithms designed to maximize engagement, often at the expense of diversity and reflection (Foerster 2003). To combat this, the concept of echology offers a way forward, transforming echo chambers into more dynamic echo corridors that encourage openness and the exchange of diverse perspectives (Fidaner 2021a).

The concept of surplus enjoyment further complicates this picture, as it reveals how digital systems tap into deep-seated desires for connection, validation, and pleasure. By commodifying these emotional responses, platforms are able to generate surplus power, a form of influence rooted in their ability to shape not only what we see and do online but also how we feel and respond to it (Žižek 1991). This commodification contributes to the rise of post-truth politics, where emotional resonance takes precedence over factual accuracy, undermining the foundations of democratic discourse (Sunstein 2017). Addressing this challenge requires a critical examination of the role of surplus enjoyment in the digital economy and the ways in which it fuels the manipulation of public opinion and behavior.

The emergence of AI technologies, particularly chatbots, represents both an opportunity and a threat in this context. On one hand, AI can be leveraged to redistribute surplus power, offering users new tools for engaging with information and generating content. On the other hand, the control of these technologies by a small number of powerful corporations risks deepening the existing hyperdigital paradox, where the promise of empowerment is overshadowed by the realities of centralized control and manipulation (Pasquinelli 2015). To navigate this paradox, it is essential to design AI systems with ethical considerations in mind, ensuring that they foster meaningful engagement rather than reinforcing existing power asymmetries.

The practice of auscultation, as we have seen, provides a critical tool for understanding and challenging the hidden dynamics of digital systems. By listening to the digital heartbeat, we can uncover the mechanisms of control that operate beneath the surface, identifying moments of disruption and resistance that offer the potential for transformation. Auscultation also calls attention to the environmental costs of digital capitalism, urging us to consider the broader ecological implications of data extraction and the maintenance of digital infrastructures (Malm 2020). As the digital and physical worlds become increasingly intertwined, it is imperative that we integrate ecological concerns into our analysis of digital systems, recognizing the ways in which echocide and ecocide are interconnected (Fidaner 2021b).

In conclusion, Cybernetic Marxism provides a powerful framework for analyzing the complex dynamics of power, information, and technology in the digital age. By examining how surplus information and surplus power are generated and exploited, and by critically engaging with the feedback loops that shape our digital experiences, we can begin to challenge the structures of control that define the contemporary landscape. The future of digital capitalism is not predetermined—it is shaped by the choices we make, the systems we design, and the ways in which we engage with these technologies. Through the lens of Cybernetic Marxism, we can envision a more equitable and sustainable digital future, one where the potential of AI and digital systems is harnessed for the public good rather than for the consolidation of power.


References

  1. Andrejevic, Mark. 2007. iSpy: Surveillance and Power in the Interactive Era. Lawrence: University Press of Kansas.
  2. Bakshy, Eytan, Solomon Messing, and Lada A. Adamic. 2015. “Exposure to Ideologically Diverse News and Opinion on Facebook.” Science 348 (6239): 1130–1132.
  3. Cheney-Lippold, John. 2017. We Are Data: Algorithms and the Making of Our Digital Selves. New York: NYU Press.
  4. D’Ancona, Matthew. 2017. Post-Truth: The New War on Truth and How to Fight Back. London: Ebury Press.
  5. Fidaner, Işık Barış. 2021a. Echology, Echosystems, Echocide. Žižekian Analysis.
  6. Fidaner, Işık Barış. 2021b. Ego is Echocide before Ecocide. Žižekian Analysis.
  7. Floridi, Luciano. 2014. The Fourth Revolution: How the Infosphere is Reshaping Human Reality. Oxford: Oxford University Press.
  8. Foerster, Heinz von. 2003. Understanding Understanding: Essays on Cybernetics and Cognition. New York: Springer.
  9. Fuchs, Christian. 2014. Digital Labour and Karl Marx. London: Routledge.
  10. Fuchs, Christian. 2020. Communication and Capitalism: A Critical Theory. London: University of Westminster Press.
  11. Gillespie, Tarleton. 2014. “The Relevance of Algorithms.” In Media Technologies: Essays on Communication, Materiality, and Society, edited by Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot, 167–194. Cambridge, MA: MIT Press.
  12. Gillespie, Tarleton. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media. New Haven: Yale University Press.
  13. Luhmann, Niklas. 2002. Theories of Distinction: Redescribing the Descriptions of Modernity. Stanford, CA: Stanford University Press.
  14. Malm, Andreas. 2020. Corona, Climate, Chronic Emergency: War Communism in the Twenty-First Century. London: Verso.
  15. Marx, Karl. 1976. Capital: Volume I. Translated by Ben Fowkes. New York: Penguin Books.
  16. Miller, Tim. 2019. “Explanation in Artificial Intelligence: Insights from the Social Sciences.” Artificial Intelligence 267: 1–38.
  17. Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.
  18. Pariser, Eli. 2011. The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think. New York: Penguin Press.
  19. Pasquinelli, Matteo. 2015. “Anomaly Detection: The Mathematization of the Abnormal in the Digital Age.” In Data-Centric Ontologies: Anomalies, Singularities, and Non-Humans in Contemporary Digital Cultures, edited by Natasha Dow Schüll and Caroline Bassett. Minneapolis: University of Minnesota Press.
  20. Srnicek, Nick. 2017. Platform Capitalism. Cambridge: Polity Press.
  21. Sunstein, Cass. 2017. #Republic: Divided Democracy in the Age of Social Media. Princeton, NJ: Princeton University Press.
  22. Žižek, Slavoj. 1991. For They Know Not What They Do: Enjoyment as a Political Factor. London: Verso.
