Abstract. The rapid development of the platform economy not only leads to changes in the organizational forms of the economy, but fundamentally transforms social relationships on the macro- and micro levels, directly affecting a major subject area of sociology. The extraction of big data by platforms makes it possible to study data on human behavior that were previously hard to access, but requires new analytical approaches. As a result, sociology is facing major substantive and methodological challenges. Not only is there a need for more effective analysis of the fast-changing object of research, but at stake is the preservation of sociological theory as a body of general explanatory schemes amid ever louder calls to replace it with computational social science. Drawing on the latest specialized literature, the author traces the logic of platform capitalism as a new method of economic integration, examines the emerging hybrid forms of exchange, reveals the contradictory nature of the new social networks, explores what is happening to the rationality of human behavior, and analyzes how social relationships are changing on different levels.

The Platform Offensive

We are witnessing rapid growth of the platform economy. As of 2022, according to Forex, the 10 most valuable companies in the world in terms of market capitalization included five platform companies, with four of them in the top five (Apple Inc., Microsoft, Alphabet Inc./Google, Amazon Inc., Meta Platforms Inc./Facebook), whereas a year earlier, two other platform corporations (Tencent, Alibaba Group) were also in the top 10. Five of the most valuable brands in the top 10 worldwide belong to platform companies, with four of them in the top five (Apple, Amazon, Google, Microsoft).

An explosive growth of big technological platform companies occurred in the 2010s. Apple moved to first place in terms of capitalization in 2011, Alphabet/Google made the top 10 in 2013, and Amazon and Facebook only did so in 2016. The earnings of the leading platform companies in the 2010s were growing at an annual rate of 25% to 50% amid a world economic growth rate of a mere 2% to 3%. By the beginning of the 2020s, Google and Facebook accounted for more than a third of all global spending on advertising, YouTube hosted almost three-quarters of all world online videos, Amazon Web Services delivered almost half of all cloud services, Apple sold one in four smartphones in the world, and Uber controlled two-thirds of all taxi calls in many countries. In the US economy at large, about 70% of the services sector is under the influence of platforms, with half of that share belonging to the sphere of direct effects [12, p. 1457].

Such swift growth of platform companies is due to the rapid development and growing affordability of new technologies, with technical memory volume doubling every three years; the emergence of cloud services that enable quick scaling of operations; the introduction of artificial intelligence as a recursive self-learning system; and the rapid spread of network-enabled mobile devices. From an economic standpoint, this was aided by relatively low credit rates and the availability of surplus capital for comparatively long-term investments. The corporate management model chosen by the big tech companies also played a role: their owners introduced a dual-class share structure, which gives founders total control of the company and scope for aggressive takeovers, greatly accelerating expansion. As a result, by 2022, according to Forex, Google chalked up 236 takeovers, Microsoft – 225, Apple – 109, and Facebook – 79. Liberal anti-monopoly legislation provided a favorable backdrop, especially in the US, the cradle of most leading platforms.

As a result of the onslaught of platforms, industries are being transformed on a large scale, and traditional industry organizational forms are being replaced in many areas, including in Russia. Google and Yandex are taking the place of traditional media; marketplaces (Amazon, Wildberries) are making inroads in traditional retailing; YouTube and Netflix content is replacing television; social networks (Facebook, VKontakte) offer new forms of communication; Uber and Yandex.taxi are replacing taxi pools; Airbnb successfully competes with hotel chains; online education platforms (Coursera, EdX) are trying to gain ground on universities; and video conference providers (Zoom, Webinar) are stimulating the vacating of office space. New hybrid organizational forms are appearing, for example, in the shape of a symbiosis of major platforms and small businesses, which offers greater access to markets. For example, in Russia small and medium-sized businesses and self-employed agents account for about 90% of sellers on Wildberries, more than 85% of sellers on Ozon, and about 80% of Russian sellers on Aliexpress Russia.

The emergence of the platform economy confronts sociology with multiple challenges. How do the new organizational forms work and how stable are they? How is capitalism being transformed and in whose hands is economic power being concentrated? How is the character of market exchange changing? What is the role of new network connections? What is happening to human behavior and social relations on the micro-level? Without claiming to be definitive, this article aims to explore these questions, outlining a future research program.

The Platform as a New Mode of Integration

Let us start with the definition of a platform as a new organizational form. From a technological standpoint, the platform is a programmed digital infrastructure enabling the interaction of spatially distributed agents. More importantly, it provides the technological basis for a new market business model geared to the extraction and use of a special type of raw material – i.e., data generated by interactions on the platform. Platform companies typically do not have assets other than the ownership of software and hardware: Uber has no vehicles, Facebook does not create content, Alibaba Group has no goods depots, and Airbnb does not own real estate. It should be noted that we have defined a platform as an ideal type that real tech companies generally correspond to only in part. For example, Apple, one of the technological leaders, is not a platform in the full sense [30, p. 41].

There are different types of platforms. Innovation platforms, which include, for example, open-source software platforms, are contrasted with transactional platforms that link sellers and buyers [27]. This article focuses on the latter. Platforms can be divided into communication (Google, Facebook), industrial (General Electric, Siemens), and service (Uber, Airbnb) platforms. Sociologists are particularly interested in work platforms (Upwork, Freelancer, FL.ru), which hire not traditional workers but independent subcontractors (self-employed agents, freelancers), interaction with whom is based on computer algorithms [28; 33]. Market deals here involve one-time services (transportation, delivery, translation, project design). Some self-employed workers operate online at a global level (programmers, designers) and others work offline in a local environment (drivers, couriers). The development of such platforms has been spurred by the “surplus” population and growth of unemployment in the wake of the 2008 financial crisis, and the spread of outsourcing and remote work. All this lowers the entry barrier to the labor market and simultaneously deregulates and de-standardizes employment in the framework of the “gig” economy [5]. Workers gain considerable autonomy, but lose job security and social protection.

Formerly, institutional economic theory identified two main forms of economic integration – the market and the firm [34]. Network communication was later added as an intermediate form [20]. Platforms include elements of a firm, market, and network, but on the whole are not reducible to any of these forms, constituting a fourth form of integration [33]. The key organizing principle of a firm is command within the organization based on bureaucratic rules; for the market, it is contracting independent players outside the organization; for networks, it is cooperation and establishing stable links between independent participants; and for platforms, it is involvement understood as the cooptation of participants who are simultaneously inside and outside the platform [31].

The distinguishing feature of platforms is that they are neither buyers nor sellers, nor even simple market middlemen in the ordinary sense; they are “merely” data administrators. They do not create corporate hierarchies, and the rules they introduce are not enforced by the usual bureaucratic methods. Instead of vertical or horizontal links, they build trilateral relations (supplier – platform – client), replacing collaborative (network) links with more dependent (closed) ones. Their monitoring of activities does not involve direct disciplinary supervision and reporting. Instead of expert or managerial control, there is user feedback. Feedback forms the basis for ratings (relatively stable scores determined against a fixed standard) and, even more important, rankings (constantly updated scores relative to other participants, without rating indicators). Online competition between suppliers or workers takes the specific form of a contest for platform visibility [31].

However, crucially, platforms assimilate a new source of power, algorithmic management, defined as the monitoring and modification of human behavior based on set computer rules with detailed structuring and automation of business processes. It is important to note that platforms involve suppliers and users in algorithmic management practices without delegating managerial powers or disciplinary control to them. For suppliers and clients alike, depersonalized algorithms represent a kind of “black box,” while periodic re-coding of the rules and unilateral changing of the criteria by which platforms assess performance is a source of additional uncertainty and dependence for them.
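To make the mechanics of ratings, rankings, and algorithmic visibility described above concrete, here is a minimal illustrative sketch in Python. It is not drawn from any actual platform: the class names, the 1–5 feedback scale, and the visibility cutoff are hypothetical, and real platforms keep such rules opaque and change them unilaterally.

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical sketch: user feedback -> a rating (an absolute score against
# a fixed scale), a ranking (a position relative to other suppliers), and a
# simple algorithmic-management rule that decides platform visibility.

@dataclass
class Supplier:
    name: str
    feedback: list = field(default_factory=list)  # client scores on a 1-5 scale

    @property
    def rating(self) -> float:
        """Relatively stable score measured against a fixed standard."""
        return mean(self.feedback) if self.feedback else 0.0

def rank(suppliers):
    """Constantly updated ordering relative to the other participants."""
    return sorted(suppliers, key=lambda s: s.rating, reverse=True)

def visible(suppliers, cutoff=4.5):
    """Opaque rule: only suppliers above the (unilaterally changeable) cutoff
    are shown prominently in search results."""
    return {s.name: s.rating >= cutoff for s in suppliers}

pool = [Supplier("A", [5, 5, 4]), Supplier("B", [4, 3, 5]), Supplier("C", [5, 4, 4])]
print([s.name for s in rank(pool)])  # ranking: ['A', 'C', 'B']
print(visible(pool))                 # visibility: {'A': True, 'B': False, 'C': False}
```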

Platforms demonstrate a steady trend toward expansion and monopolization [30]. Competing with one another, they increasingly seek to break into contiguous markets and to lure more and more participants. This is achieved through large-scale cross-subsidizing – first of certain suppliers, then of users – whereupon network effects kick in: users draw in new users. The task of a platform is to make users dependent on its services and, accordingly, to impede the use of alternative products (or take over their producers). Platforms try to create relatively closed ecosystems, luring users away from the open Internet toward increasingly closed applications and cultivating a logic of total dependence on a platform where, ideally, users can find everything they need without leaving its confines.

Thus, platforms are a form of integration different from a firm, market, or network. They are turning into a dominant organizational form with a tendency toward monopolization, making another step in organizational evolution from capitalist factory to big corporation, then to global production networks, and finally to platforms [9].

The Logic of Platform Capitalism

In critical social theory, there is now a distinct sense that the platform economy provides an organizational basis that is not simply yet another economic pattern, but a new type of socio-economic system called “surveillance capitalism” [35; 26] or “platform capitalism” [30]. In the opinion of Shoshana Zuboff, traditional industrial capitalism is based on the processing of physical (initially natural) raw material by bringing together the labor of workers and means of production (machines, equipment, etc.). By contrast, platform capitalism inverts the production process, liberating it in many ways from the physical worker. The main raw material here is data on human behavior as a by-product of users’ searches and communication on the platform. Users are not hired workers, but by performing acts on the platforms and leaving digital footprints, they become involuntary producers of the main raw material.

Platforms create technologies of digitization, retrieval, storage, processing, analysis, and representation of behavioral data. These data are unilaterally appropriated by platforms as freely accessible raw material without notifying users or by giving them purely formal notice. The spectrum of the data collected is constantly expanding, spreading from the virtual to the real world where the location, purchases, conversations, and other forms of individual behavior are recorded, gradually including their motives and emotions. Tech companies are accumulating ever growing private shadow stocks of knowledge. Initially, these data are used to improve the quality of search and other user services and then to derive profit through targeted advertising toward specific users, which accounts for the bulk of the income of major platforms. Importantly, what is sold to advertisers is not the raw material, but a processed product in the shape of forecasts or so-called behavioral futures [35]. Even if the share of advertising profits drops in favor of paid subscriptions and micro-payments for client services [30], the process of extracting more and more client data is unlikely to stop.

Under these conditions, how do the main groups of market participants change? In traditional industrial capitalism, the key confrontations were those between managers and workers within a firm, as well as between buyers (procurers) and suppliers in a market exchange, with the former usually controlling the latter. In platform capitalism, workers (many of whom are self-employed) and suppliers are stripped of their former basic rights as controlling managers, and buyers are replaced by the programmers and customizers of algorithms. Platform users not only create value in the form of data, but they also stand in for managers and buyers in assessing the performance of workers and suppliers. As a result, a regulative coalition is formed of platforms (and the investors behind them) and the users of platforms against self-employed workers and suppliers [31]. Users support platforms not only economically, but also politically by legitimizing the new business model and protecting it from attempts at state regulation under the slogan of free access to information [24].

Hybrid Forms of Exchange

Platforms are a contradictory combination of different types of relations of mutuality (reciprocity), autonomy, and dominance [27]. Thus, a hybrid combination of non-market and market forms of exchange emerges in platform capitalism. Exchange with platform users is non-market: free personal access to information services is traded for free access to data on users’ behavior, often collected without their consent. In effect, it is not only non-market but essentially non-economic exchange, because users have no option to withhold their personal data without compromising the quality of services – they are offered a “take it or leave it” contract. Users are not even able to delete previously created information (which continues to be stored on platforms), and platforms can alter the terms and algorithms unilaterally without notifying users. Simultaneously, platform companies engage in real market exchange by selling behavioral forecasts to advertisers and other interested parties. This warrants speaking about behavioral data as a special kind of fictitious commodity alongside land, labor, and money [18; 8]. Human experience as a free raw material is digitized, appropriated, and commodified, although in reality it is not intended for sale and is not the result of labor in the ordinary sense, since the bulk of the data is produced by users unintentionally.

Because some information is personal and confidential, data marketization, as with other fictitious commodities, meets with resistance on the part of society and needs to be regulated by the state. Attempts have been made to protect personal information – one example being the General Data Protection Regulation adopted by the European Parliament in 2016. Monopolization is also targeted – consider the series of anti-monopoly lawsuits with billions in fines against Alphabet (Google) in the European Union. So far, however, big tech companies have been able to cope with the situation by introducing self-restrictions or pretending to backtrack. They find many ways to neutralize the resistance. Companies improve and customize services, come up with impressive technological innovations, and argue that progress is inevitable and there are no alternatives. Changes are introduced so quickly that existing laws and ethical norms cannot keep up with them. In addition, problems arise with understanding the innovations because they cannot be described using established concepts [35].

Given all this, tech companies position themselves as heroic entrepreneurs [29], disguising their market motives with technical jargon, often acting surreptitiously and keeping their operations secret. Data collection is justified by appeals to a “natural” right of general open access to information. Everything is done to make users dependent and to deny them the opportunity to turn down the services offered. Economic incentives are offered in the form of scores, bonuses, and discounts. Users are lured and retained through mechanisms of gamification and “if-you-are-not-on-the-net-you-don’t-exist” rhetoric.

Alliance of Platforms and the State

Platform capitalism changes the relations between the market and the state, which develop in different directions. On the one hand, the state in some ways grows weaker vis-à-vis tech companies, which, becoming the main sources of knowledge, assume the tasks of monitoring and modifying people’s behavior. These companies exploit a neo-liberal ideology to diminish state regulation and promote self-regulation of their activity. On the other hand, there is obviously growing demand in society for “the big state” [14], prompted among other things by economic crises, the coronavirus pandemic, trade wars, sanctions, and military operations.

It is significant that platform companies, in collecting behavioral data for commercial purposes, have entered into competition with the state, which claims the same resource in the interests of national security. However, the inevitable tension does not prevent the formation of tactical (public and hidden) alliances between major tech companies and the state, cemented by mutual concessions and the exchange of information, including on a commercial basis. In addition, tech companies try to use the state in their own interests, lobbying for measures such as loosening confidentiality requirements on the Internet to clear the path for further data collection, or preserving contractual relations with the self-employed to prevent them from becoming hired workers. Tech companies have also been rendering effective assistance to politicians during campaigns by more accurately predicting the outcome of elections (an example is the Cambridge Analytica scandal in the US in 2018).

A particularly favorable environment for behavioral data collection was created in the wake of the terrorist attacks in the US on September 11, 2001, and a series of terrorist attacks in other countries. In the context of “surveillance exceptionalism” [35], state power dramatically shifted its focus from confidentiality toward security. Law-enforcement agencies stepped up their surveillance activities, and the Internet was increasingly “militarized” as part of combating terrorism. This increased the interdependence between law-enforcement agencies and the private sector in various countries, partly circumventing legal and bureaucratic limitations. The war on terror and crime is used to justify total surveillance and the demand for prompt delivery of information about specific individuals.

Establishment of Total Surveillance

Because the accuracy of forecasts depends directly on the amount of collected and processed data, tech companies seek to extract more and more information on practically all aspects of people’s behavior. The excess behavioral data accumulated by Google now cover virtually everything in the online environment: search history, e-mails, texts, photographs, songs, messages, video, locations, communication methods, equipment, preferences, interests, facial images, emotions, illnesses, social networks, purchases, and so on [35, pp. 172-173]. Truly, we are looking at global tracking, or total surveillance.

Tracking identifiers, which make it possible to collect collateral behavioral data, are becoming standard for new technological spheres (not only media platforms, but also, for example, telecommunications companies). User data collection continues even when key systems are switched off – you can leave the social network and the Internet, and switch off your smartphone, but the process does not stop. Besides, constant tracking spreads from the virtual to the physical environment: outdoors and indoors, we are constantly watched by a growing number of surveillance cameras that recognize faces, and personal assistants (Siri, Alice) constantly convey information about our preferences to companies. In the near future, with the transition to the Internet of Things and the installation of more and more sensors on all kinds of material objects and on the human body in the form of gadgets and implants, data collection will become increasingly ubiquitous.

It is not just platforms and commercial companies in general that engage in surveillance; the state does it too. In China, for example, attempts have been made to introduce a social credit system as a new tool of stratification, improving citizens’ behavior – i.e., achieving not only market but also social and political results.

Despite constant rhetoric about the concern of the state and tech companies for personal data safety and their complaints about data leaks [29], the former confidentiality standards are being loosened or even ignored, reduced to token measures – for example, persistently requesting clients’ consent to the use of cookies that convey information about their actions.

Tech companies claim that behavioral data are used only in a depersonalized form, but these claims raise serious doubts, and in any case the ordinary user does not understand the essence of technological processes that are shrouded in commercial secrecy. Experts, however, say that metadata can easily be deanonymized and a person can be identified from publicly available information [35]. Increasingly often we receive ads for specific goods and services we have not asked for but merely mentioned in our communication with other people.

Surveillance is turning into a commercial service: the data collected enable employers to vet in depth workers about to be hired or already hired (from monitoring social networks to supervising computer screens with programs such as Teramind or InterGuard), banks to check the reliability of borrowers, and real estate owners to obtain profiles of tenants. We are talking not about the former general formulas (for example, scoring systems), but about detailed screening of specific candidates, while the candidates themselves are excluded from the process and cannot influence the character of the information provided, which may contain inaccuracies or be based on discriminatory criteria.

Another important element of the emerging new relations is the deliberate destruction of the boundary between the public and private life of people in order to obtain more and more personal data. As early as 2010, Mark Zuckerberg, the founder of Facebook, proclaimed that “privacy is no longer a social norm.” The historical roots of this approach go back to the idea of the Panopticon, an ideal prison proposed in the late 18th century by Jeremy Bentham and later used by Michel Foucault to develop a new theory of power based on total surveillance [7]. In our day, the scheme is supported by modern technologies that replace direct visual contact with a system of distributed cameras that instantly identify any person and easily change the observed space, thus extending the surveillance system to any area. But the main surveillance tool is the social networks, where users readily produce information about themselves, making it easier to control their behavior. In effect, social networks have inverted the Panopticon formula. The old Panopticon separated the sides in the “to see – to be seen” relationship; the social network unites these sides: everyone sees everyone else and is at the same time visible to all, thus performing the role of both supervisor and supervised. But in displaying themselves to others, the participants make themselves visible to the authorities who, as in the old Panopticon, remain invisible in the darkness of the tower and are able to observe everyone at once. Only, unlike in the Panopticon, the tower is not at the center but somewhere off to the side, unseen by the participants, who can only guess at the existence of engaged analysts [23].

The Controversial Role of Social Networks

The concept of social networks as stable (structural) links between contracting parties was formed in sociology and related disciplines as early as the mid-20th century [19; 20]. The 2000s saw the emergence of an entirely new concept – the new social media, or online services for building, representing, and maintaining social relations [3]. How do social relations change when social networks move to online platforms?

On the whole, the role of the new social networks in shaping social relations is diverse and contradictory. All other things being equal, compared with the offline world, the number of contracting partners and the density of network connections grow online, and the intensity of contacts increases. At the same time, the links become more heterogeneous, the average strength of contacts diminishes, links are less confined to cohesive cliques, and connections become more superficial and less durable. On one hand, networks become a tool of active individual representation and an effective means of spreading information about oneself. On the other hand, the boundary between the public and the private is voluntarily erased, with networks becoming the main source of information about a person without his/her knowledge or consent. Social networks drastically democratize the production of content, enable its collaborative production, and offer opportunities for remote work, the sharing economy, and the formation of virtual communities. At the same time, we see an obvious degradation of the bulk of the content produced, a shortage of live contacts, and a trend toward self-isolation. People’s growing involvement in non-stop communication becomes addictive, breeding an inability to switch off, social apprehension (the fear of missing out on information, FOMO), and various depressive states.

Social networks greatly broaden opportunities to take part in social life and make it possible to quickly mobilize public opinion, but at the same time they increase social pressures on the individual and launch massive mechanisms of social comparison, which particularly affect the younger generations [35; 21]. Ethical problems mount. On the one hand, network users have an unheard-of freedom of self-identification; on the other hand, although all actions leave digital footprints, the online world reveals an alarming dilution of ethical norms due to a sense of impunity for words and actions. Network participants become victims of trolling and bullying, not to mention new forms of fraud. The increase in the number of conflicts is compounded by the increasing difficulty of resolving critical (conflict) situations. Classical procedures for resolving such situations (mutual criticism, the search for equivalence and compromises) presuppose that people discuss them among themselves [2]. Today, we see this mechanism of civilized face-to-face conflict resolution breaking down as conflicts spill out onto social networks (public space) without any discussion, mobilizing public opinion. The result is often the quick rise (multiplication) of hatred without the participants immersing themselves in the essence of the conflict, which makes it more difficult to extinguish.

Behavioral Engineering and New Rationality

Because the logic of the platform economy is in many ways built around the study of people’s behavior, its growth has immediate implications for sociology and its prospects. Several important trends need to be noted. First, the study of behavior is moving from specialized periodic studies to the constant collection of big data formed as a by-product of daily activities regardless of the researcher’s intentions [10; 16]. Second, sample observations, on which sociology used to be based, are giving way to blanket (total) observation. Third, it is now possible to move from social typologies and statistical aggregates (another hallmark of sociology) to observing the behavior of every concrete individual. Fourth, we see a movement from scientifically grounded suppositions (sociological hypotheses) to precise (calculated) knowledge of behavior. Fifth, we are asked to transition from general theories to predicting demands and behavior practically without any theory. Finally, along with the analysis of behaviors, an opportunity presents itself to influence behavior. And this is not the former blanket (“carpet”) influence on behavior (for instance, through traditional advertising) but targeted (selective) influence. In other words, we are urged to move from scientific (including sociological) research to tracking by collecting big data and further to behavioral engineering.
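The first three of these shifts can be made concrete with a toy contrast, sketched below in Python on entirely synthetic data: a survey-style sample yields an aggregate estimate with a margin of error, whereas blanket observation simply counts everyone and retains an individual-level record for each concrete user. All numbers and names here are invented for illustration only.

```python
import random
from math import sqrt
from statistics import mean

# Toy contrast on synthetic data: sample-based estimation of an aggregate
# versus blanket (total) observation with individual-level records.

random.seed(0)
# Synthetic "population": 1 if the user performed some action X, else 0.
population = {f"user_{i}": int(random.random() < 0.3) for i in range(100_000)}

# Survey-style sociology: estimate the share of the population doing X
# from a sample of 1,000, with a 95% margin of error.
sample = random.sample(list(population.values()), 1_000)
p_hat = mean(sample)
margin = 1.96 * sqrt(p_hat * (1 - p_hat) / len(sample))
print(f"sample estimate: {p_hat:.3f} +/- {margin:.3f}")

# Blanket observation: no estimation step - the platform counts everyone
# and keeps the behavioral record of each concrete individual.
print(f"exact share: {mean(population.values()):.3f}")
print("individual record for user_42:", population["user_42"])
```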

Since the most reliable way to improve the accuracy of predictions is to go beyond tracking and to modify behavior toward a guaranteed outcome, tech companies arrogate the right to modify others’ behavior and change it unbeknownst to users to make a profit – secretly, on a large scale, and in the absence of any social and legal constraints [35].

Mechanisms for influencing behavior also change drastically. First, in a situation of total surveillance, in Michel Foucault’s terms [7], space-related technologies draw people into specific “disciplinary regimes” whereby they begin to reproduce expected actions on their own, even when they are not being watched. Second, former coercion or cultivation of norms gives way to nudging, associated with the name of Richard Thaler, the herald of libertarian paternalism and a Nobel Prize winner. Nudging is inducing people to change their behavior without formal or visible restrictions and without resorting to physical or normative coercion. It involves the purposeful structuring of situations to obtain a desired outcome. Nudging can be effected by a variety of means, including fine-tuning search results, insistent recommendations, and the manipulation of information; appealing to economic interest by offering individual bonuses or discounts; mobilizing social networks; exploiting gaming technologies; and using persuasion and conveying emotional states (encouraging sympathy). Nudging applies to any aspect of the decision-making process, which leads people to modify their behavior in a certain way without visible coercion [32].
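As a hedged illustration of what such purposeful structuring of a situation might look like in code, the sketch below re-orders a fixed set of options and preselects a default by blending the user’s own interest with the designer’s. The option names, weights, and the bias parameter are entirely hypothetical and are not drawn from any real recommendation system.

```python
from dataclasses import dataclass

# Hypothetical sketch of choice architecture: the option set is unchanged,
# but the display order and a preselected default steer users toward an
# outcome desired by the designer of the situation.

@dataclass
class Option:
    label: str
    user_fit: float       # how well the option matches the user's own interest
    designer_gain: float  # how much the choice architect benefits from it

def nudge_order(options, bias=0.7):
    """Re-rank options by blending the user's interest with the designer's.
    The higher the bias, the more the 'recommended' order reflects the
    designer's interest rather than the user's."""
    score = lambda o: (1 - bias) * o.user_fit + bias * o.designer_gain
    return sorted(options, key=score, reverse=True)

def with_default(options):
    """Preselect the top of the nudged list as the default choice,
    which most users will accept without visible coercion."""
    ranked = nudge_order(options)
    return ranked[0], ranked

menu = [Option("basic", 0.9, 0.2), Option("premium", 0.5, 0.9), Option("ads-funded", 0.4, 0.8)]
default, ordered = with_default(menu)
print("preselected default:", default.label)         # premium
print("display order:", [o.label for o in ordered])  # ['premium', 'ads-funded', 'basic']
```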

The ideas of nudging are rooted in the insights of behavioral economics, which, unlike mainstream economic theory with its assumption that human behavior is rational, argues that human thought and behavior are often irrational, prone to mistakes and cognitive biases, and in need of adjustment [11]. From that viewpoint, free choice can be interpreted as random choice stemming from insufficient knowledge and an inability to calculate one’s own future. Nudging toward the desired choice (presumably the best for the person nudged) makes behavior more stable and predictable – i.e., more rational.

This brings us to another issue important to sociology – namely, the formation of a new rationality. To begin with, under the conditions of the dominance of machine algorithms, an attempt is being made to renounce substantive rationality based on differentiated values and norms in favor of formal, strictly instrumental (practical) rationality gravitating toward the universalism that sociologists have objected to so vehemently. But things do not end there, since formal rationality also undergoes substantial change. In place of the “old” individual rationality based on free choice in one’s own interest (even if misguided) comes supra-individual rationality based on algorithmic solutions. This new rationality overcomes uncertainty and subjugates behavior to the commercial interests of the market. As a result, whereas the appropriateness of labeling humans as Homo economicus used to be the subject of academic sociological debate, today humans are glibly inserted into this model, and hundreds of dollars are earned each year from every active user [4, p. 170].

Another important consequence on the macro level has to do with the replacement of human autonomy by external control. At first, machine algorithms are adjusted to humans and their interests, making choices more effective and comfortable; but then humans are placed in information bubbles [17], and the algorithms gradually begin to adjust humans to themselves, offering what they offer the majority and mixing others’ interests into the individual’s choice. A nudge toward an optimum choice may easily be transformed into sludge (the creation of difficulties for such a choice), to use Thaler’s terms. Thus, the autonomy of the individual as independence from others and the ability to create one’s own meanings and make one’s own choices is replaced by heteronomy (external regulation), and free choice becomes largely an illusion, being replaced with prescribed choice.

In turn, human will, or the ability to act contrarily, as an inalienable element of autonomy, is replaced by reinforcement and nudging, removing extra tension and making life more comfortable inasmuch as it liberates humans from extra effort and the burden of uncertainty. As a result, most people voluntarily (albeit with constant nudging) delegate their right to autonomy and free choice to automatic systems, trading this right for free access to information, customized services, ease of communication, and coveted certainty.

The aforementioned blurring of the boundary between the public and the private deals another major blow to human autonomy, which does not exist without privacy and the possibility of seclusion (physical and spiritual). A paradox emerges: while formerly economic sociology fiercely upheld the idea of the social embeddedness of human actions [22], today it looks as if it will have to fight for human autonomy.

In general, we are looking at another attempt to disembed the market in its constant quest to be “decoupled” from social relations, something Karl Polanyi pointed out in his time [18]. This impulse has never been accidental. Today, too, it exploits the very real systemic demand for individualized consumption and the accumulated weariness with institutional restrictions. The market is launching its offensive amid rhetoric about liberating daily life from outdated institutions, “throwing off the fetters,” and embracing the world of advanced technology.

At the end of the day, any power seeks monopoly and totality. From this perspective, social institutions and ethical norms are seen rather as unnecessary barriers and friction forces. Moreover, institutions (formal and informal) do not keep abreast of the rapid development of technologies and business models, so the activity areas of big tech companies are often unregulated [24]. But the existing laws and ethical norms are proclaimed to be “hopelessly obsolete.” Thus the rules that formerly were the result of a long evolution and complicated deals between various interest groups are increasingly being replaced by a different kind of rules – i.e., not very transparent algorithmic calculations – prompting concerns about the growing threat of algocracy [6].

Toward New Post-Social Relations

Back in the mid-1980s, long before the advent of platform capitalism, the advocates of “science and technology studies” in the framework of actor-network theory argued that material objects (things, artefacts) become actors along with human beings [15]. This implied a change of attitude toward social relations: “The influx of object worlds coincides with changing patterns of human relatedness that can be glossed by the notion of postsocial forms. Postsocial forms include object-relationships where the objects are non-human entities” [13, p. 445].

There is a logic to the invasion of things into social relations: it progresses from mediating the relations of living beings, to physical merging with them, then to interaction with living entities, and finally to the gradual replacement of human relationships. At the first step, as a result of the rapid progress of the digitization of daily life, all behavior and all social relations are or soon will be mediated by computers. But technologies are not a mere passive instrument or an external factor. That is why, in the second step, electronic devices and sensors are woven into the very fabric of day-to-day life, which increasingly unfolds in virtual worlds, and are built into the human body itself. In this computerized world, social relations turn into code and return to people after being filtered by machine algorithms.

The third step is when the mediation of relations is followed by direct interaction between living and non-living entities. With the introduction of robots (voice assistants, chatbots, robot operators) equipped with self-learning artificial intelligence, interaction between human and non-human entities becomes increasingly widespread and complicated, which may be seen as “artificial sociality” [25]. Here it is increasingly difficult to discern the living from the non-living because non-living objects become more and more autonomous and are getting better and better at imitating living entities. This is not the mere transposition of habitual modes of interaction, but the emergence of new iterational processes of dealing with non-living beings [1].

Machines are in many ways superior to humans; they think faster and, unlike most people, learn from the mistakes of others. In an automated system, relations between living people are more often than not seen as “friction” that puts annoying extra barriers in the way of data assimilation and introduces added uncertainty due to the inconsistency of human actions and insufficient knowledge. The desired certainty is achieved with the help of intellectual algorithms. Then comes the fourth logical step: from the simple automation of social processes to the substitution of social relations by machine processes, prompted, on the one hand, by the profit motive and, on the other hand, by the motive of social control. Machine relations begin to aspire to be the key model of social relations. While previously the behavior of machines (robots) was constructed to fit humans, now human behavior is beginning to be designed to fit machines [35].

With the growing power of machines, the need to trust people diminishes, since it has inevitable elements of uncertainty and risk (relations between humans are volatile and fickle). There emerges a new and seemingly stronger form of embeddedness of relations in intellectual algorithms in which there is little room for traditional sociology. All this confronts sociology with a number of new questions stemming from the need to understand the character of new post-social relations and project its own future in the changed conditions.

Conclusion

Sociology today is facing major substantive and methodological challenges. The rapidly developing platform economy not only creates a new economic pattern but claims to be the leading form of society’s socioeconomic system. The new system of platform capitalism seriously changes the nature of social relations not only on the macro- but also on the micro level. The first challenge is the need for effective analysis of the fast-changing object of study.

At the same time, the extraction of big data creates many opportunities for sociology, providing access to data on human behavior that were previously hard to obtain. But this creates the second challenge, calling for new analytical approaches. The third challenge is that the very survival of sociological theory as a body of explanatory schemes is at stake. There are ever more insistent calls for replacing traditional sociology with computational or case-based social science. Finally, the fourth challenge stems from the dilution of the very object of sociological studies – i.e., social relations in the forms we are used to. This raises the question of the new pragmatic functions of sociology as a professional discipline. Hopefully, sociology will rise to at least some of these challenges.

References

1. Abramov R. N., Katechkina V. M. Social Aspects of Human-Robot Interaction: Experimental Research Experience. Zhurnal sotsiologii i sotsialnoy antropologii (= Journal of sociology and social anthropology). 2022. Vol. 25. No. 2, pp. 214-243. (In Russian.) DOI: https://doi.org/10.31119/jssa.2022.25.2.9

2. Boltanski L., Thevenot L. Critique and Justification of Justice: Essays on the Sociology of Worlds. Moscow: NLO, 2013. (In Russian.)

3. Boyd D. M., Ellison N. B. Social Network Sites: Definition, History, and Scholarship. Journal of Computer-Mediated Communication. 2008. Vol. 13, pp. 210-230.

4. Chereshnev E. Life Form No. 4: How to Stay Human in the Age of Flourishing Artificial Intelligence. Moscow: Alpina Publisher, 2022. (In Russian.)

5. Crouch C. Will the Gig Economy Prevail? Ekonomicheskaya sotsiologiya (= Journal of economic sociology). 2019. Vol. 20. No. 4, pp. 70-77. (In Russian.)

6. Danaher J. The Threat of Algocracy: Reality, Resistance and Accommodation. Philosophy and Technology. 2016. Vol. 29, pp. 245-268.

7. Foucault M. Discipline and Punish: The Birth of the Prison. Moscow: Ad Marginem, 1999. (In Russian.)

8. Grabher G., Konig J. Disruption, Embedded. A Polanyian Framing of the Platform Economy. Sociologica. 2020. Vol. 14. No. 1, pp. 95-118.

9. Grabher G., van Tuijl E. Uber-Production: From Global Networks to Digital Platforms. Environment and Planning A: Economy and Space. 2020. Vol. 52. No. 5, pp. 1005-1016. DOI: https://doi.org/10.6092/issn.1971-8853/10443

10. Guba K. S. Big Data in Studies of Science: New Research Field. Sotsiologicheskie issledovaniya (= Sociological studies). 2021. No. 6, pp. 24-33. (In Russian.) DOI: 10.31857/S0132162250013878-8

11. Kahneman D. Thinking, Fast and Slow. Moscow: AST, 2014. (In Russian.)

12. Kenney M., Bearson D., Zysman J. The Platform Economy Matures: Measuring Pervasiveness and Exploring Power. Socio-Economic Review. 2021. Vol. 19. No. 4, pp. 1451-1484. DOI: 10.1093/ser/mwab014

13. Knorr-Cetina K., Bruegger U. The Market as an Object of Attachment: Exploring Postsocial Relations in Financial Markets. Western Economic Sociology: Handbook of Contemporary Classics. Ed. by V. Radaev. Moscow: Rosspen, 2004, pp. 445-468. (In Russian.)

14. Krastev I. Is It Tomorrow Yet? Paradoxes of the Pandemic. New York: Penguin, 2020.

15. Latour B. Reassembling the Social: An Introduction to Actor-Network Theory. Moscow: HSE, 2014. (In Russian.)

16. McFarland D. A., Lewis K., Goldberg A. Sociology in the Era of Big Data: The Ascent of Forensic Social Science. American Sociologist. 2015. Vol. 47. No. 1, pp. 12-35.

17. Pariser E. The Filter Bubble: What the Internet is Hiding from You. London: Viking, 2011.

18. Polanyi K. The Self-Regulating Market and the Fictitious Commodities: Labor, Land and Money. THESIS. 1993. Vol. 1. No. 2, pp. 10-17. (In Russian.)

19. Powell W., Smith-Doerr L. Networks and Economic Life. Ekonomicheskaya sotsiologiya. 2003. Vol. 4. No. 3, pp. 61-105.

20. Powell W. W. Neither Market nor Hierarchy: Network Forms of Organization. Research in Organizational Behavior. 1990. Vol. 12, pp. 295-336.

21. Radaev V. V. Millennials: How Russian Society Is Being Changed. Moscow: HSE, 2020. (In Russian.)

22. Radaev V. V. Once Again on a Subject of Economic Sociology. Sotsiologicheskie issledovaniya. 2002. No. 7, pp. 3-14. (In Russian.)

23. Radaev V. V. Watching Movies, Exploring Life: 19 Sociological Essays. Moscow: HSE, 2021. (In Russian.)

24. Rahman K. S., Thelen K. The Rise of the Platform Business Model and the Transformation of Twenty-First-Century Capitalism. Politics and Society. 2019. Vol. 47. No. 2, pp. 177-204. DOI: https://doi.org/10.1177/0032329219838932

25. Rezayev A. V. (Ed.) From Artificial Intelligence to Artificial Sociality. Moscow: VTsIOM, 2020. (In Russian.)

26. Safronov E. E. Transformation of Capitalism in the 21st Century: “Surveillance Capitalism” Concept by Shoshana Zuboff. Sotsiologicheskie issledovaniya. 2021. No. 4, pp. 165-172. (In Russian.) DOI: 10.31857/S0132162250010848-5

27. Schüßler E., Attwood-Charles W., Kirchner S., Schor J. B. Between Mutuality, Autonomy and Domination: Rethinking Digital Platforms as Contested Relational Structures. Socio-Economic Review. 2021. Vol. 19. No. 4, pp. 1217-1244. DOI: 10.1093/ser/mwab038

28. Shevchuk A. V. From Factory to Platform: Autonomy and Control in the Digital Economy. Sotsiologiya vlasti (= Sociology of power). 2020. Vol. 32. No. 1, pp. 30-54. (In Russian.) DOI: 10.22394/2074-0492-2020-1-30-54

29. Smith B., Browne C. A. Tools and Weapons: The Promise and the Peril of the Digital Age. Moscow: Alpina Publisher, 2021. (In Russian.)

30. Srnicek N. Platform Capitalism. Moscow: HSE, 2019. (In Russian.)

31. Stark D., Pais I. Algorithmic Management in the Platform Economy. Ekonomicheskaya sotsiologiya. 2021. Vol. 22. No. 3, pp. 71-103. (In Russian.) DOI: 10.17323/1726-3247-2021-3-71-103

32. Thaler R., Sunstein C. Nudge. The Architecture of Choice: Improving Decisions about Health, Wealth and Happiness. Moscow: Mann, Ivanov and Ferber, 2017. (In Russian.)

33. Vallas S. P., Schor J. B. What Do Platforms Do? Understanding the Gig Economy. Annual Review of Sociology. 2020. Vol. 46, pp. 273-294. DOI: https://doi.org/10.1146/annurev-soc-121919-054857

34. Williamson O. E. Vertical Integration of Production: Thoughts on the Market Failures. Theory of Firm. Ed. by V. M. Galperin. St. Petersburg: Ekonomicheskaya shkola, 1995, pp. 411-442. (In Russian.)

35. Zuboff S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Moscow: Institut Gaidara, 2022. (In Russian.)