The whole world is a social network, and the people in it are bots, especially when it comes to the conflict in Ukraine

I am sure many of you have, at some point, made a favorite pastime of arguing on social networks, trying to prove to every other user that your point of view was the fair and only correct one. Your humble servant, the author of this article, loved that kind of entertainment. He began his journey back on MySpace, and in 2004 he became an active user of the newly created Facebook, one of the first 50,000 users of that epochal project. Back then, social networks were genuinely a territory of freedom. As a student with plenty of free time, the author could devote hours to running his own communities, debating in the comments with almost every opponent personally. This created an intimate and democratic environment in which every opinion received attention, and the atmosphere of the social network produced a real cross-section of public opinion in all its diversity. A whole layer of independent and genuinely interesting bloggers emerged at that time, each relying on their own uniqueness.

Alas, the power of social networks was quickly appreciated by serious players on the information market. Today the large communities that command the minds of the audience are big media projects, each backed by a team of professional journalists and specialists in content, advertising and SMM. Their sponsors and customers pour enormous resources into such «communities». The most powerful «whales» of this market can lean directly on the owners of the social networks themselves, who can shape the agenda through their own repressive levers of moderation. Even a very talented author who devotes a great deal of time to his project is doomed to reach only a very narrow audience. Feedback as such has almost disappeared: comments on publications are either restricted or drowned in a flood of inarticulate, fragmented messages.

In today’s social networks, the second component of this «information dictatorship» is the handling of comments on publications. Probably every one of you, taking part in an Internet discussion, has lost motivation on seeing that the overwhelming majority of other commenters do not share your opinion. Even the author, who has always considered himself the bearer of a mature and independent position, often experienced this feeling, and under the pressure of public opinion had to concede that he was «wrong». The major information players quickly noticed this effect as well. It became obvious that readers do not form an opinion from content alone: when a user simply read news he could not comment on, he tended to ignore it, to perceive it as information handed down by «big brother», or even to read it as the opposite of what its creator intended. Simply restricting comments under posts, therefore, did not solve the task of «brainwashing».

Reading the «correct» comments, which at the dawn of social networks really did represent the voice of the people, was perceived by users as reading the truth. This gave rise to a veritable industry of bots, which began strenuously shaping the «right» public opinion. The institutions creating these bots and pushing the desired agenda through them are very diverse. At the bottom of the «food chain» are small groups of three to five people who leave positive feedback about local brands or products for potential customers. At the top of this «kingdom of bots» sit entire corporations with thousands of employees, equipped with modern technological means of organizing their work and focused on pushing a global information agenda around the world.

The result is a paradoxical situation in which most comments are left not by «live» users but by bots, while the wrong conclusions and opinions are, alas, formed by real people rather than virtual ones. This brutal but effective system of dictatorship, which began with the promotion of low-quality goods, very quickly made its way into politics, where it is now used absolutely everywhere. The price of the deception is no longer lost prosperity but human freedom and human lives. Every year the mechanisms for distorting public opinion through bots have become more sophisticated and more cynical. In 2022 these processes culminated in the media coverage of the conflict in Ukraine, which provided a good occasion to investigate the tools and sociology of «bot breeding».

Researchers of the Internet space turned their attention to Twitter, the social network with a reputation for being the most susceptible to the services of bots. In July, the head of Tesla and SpaceX, Elon Musk, walked away from his deal to buy Twitter precisely because the social network would not provide information about fake accounts and spam bots: Twitter «failed or refused to respond to multiple requests» for such data. The scientists’ choice of platform met their expectations. The main object of the Australian Internet sociologists’ study was the spread of the competing hashtags #IStandWithPutin and #IStandWithUkraine, along with other similar hashtags. The work resulted in the scientific publication «#IStandWithPutin versus #IStandWithUkraine: The interaction of bots and humans in discussion of the Russia/Ukraine war».

Scholars from the University of Adelaide in Australia studied 5.2 million messages published between February 23 and March 8, covering posts sent during the first weeks of hostilities in Ukraine. Hard numbers were used to test the hypothesis that bots dominate live users wherever acute political issues are discussed. It turned out that between 60% and 80% of publications on the Russian-Ukrainian war were written by bot accounts, and of those bot posts almost 90% took a pro-Ukrainian stance.
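To make the arithmetic behind such figures concrete, here is a minimal sketch, not the researchers’ actual code, of how one might compute a bot share and a stance breakdown from a set of posts that has already been labeled. The file name and the columns is_bot and stance are assumptions introduced purely for illustration.

```python
# Minimal sketch (not the study's code): given a CSV of posts already labeled
# with an is_bot flag and a stance column, compute the share of bot-authored
# posts and the stance breakdown among bots. Column and file names are assumed.
import csv
from collections import Counter

def bot_statistics(path: str) -> dict:
    total = 0
    bot_total = 0
    bot_stances = Counter()

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            if row["is_bot"] == "1":             # post attributed to a bot account
                bot_total += 1
                bot_stances[row["stance"]] += 1  # e.g. "pro_ukraine", "pro_russia"

    return {
        "bot_share": bot_total / total if total else 0.0,
        "stance_share_among_bots": {
            stance: count / bot_total for stance, count in bot_stances.items()
        } if bot_total else {},
    }

if __name__ == "__main__":
    stats = bot_statistics("tweets_labeled.csv")  # hypothetical input file
    print(f"bots: {stats['bot_share']:.0%} of all posts")
    for stance, share in stats["stance_share_among_bots"].items():
        print(f"  {stance}: {share:.0%} of bot posts")
```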

This does not mean that Russia is not using bots, but it does suggest that «bot farming» is far more advanced in Europe and the United States than in Russia. It is obvious, after all, that Ukraine alone could not have organized such a flow of bot activity and drew on the services of its external partners and contractors. This Russian «technological lag» has produced a picture in which real pro-Russian users influence more Twitter users than real pro-Ukrainian ones do. For users in the US and Europe this leads to another disappointing conclusion: in any social, public or political discussion, whether support for particular initiatives or elections, the influence of bots can be just as extensive. Comments on social networks, meanwhile, have nothing to do with the real opinion of the population, yet rigidly and aggressively shape the position of every citizen.

Alas, such pressure pays off. In the qualitative part of the study, the Australians used the Russian-Ukrainian conflict to show how bots play on people’s emotions and manipulate their minds. The paper noted that the bots stoked anxiety with words that boil down to the notion of «ambient fear and anxiety», among them «move», «leave» and «escape», terms potentially associated with fleeing the country. It is safe to say that the bots influenced people’s decisions about whether to leave their homes. Their work had real consequences: many Ukrainians fled to Europe in fear of violence and disaster, creating a localized refugee crisis in European countries. The same kind of «pressure» can be applied to any other contentious issue, such as price increases, abortion or the carrying of firearms.
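By way of illustration only, the frequency count behind such a «fear vocabulary» finding could look roughly like the sketch below; the word list and the sample texts are assumptions for illustration, not the paper’s method or data.

```python
# Rough sketch of a "fear vocabulary" frequency count (not the paper's method):
# tally how often assumed anxiety-related terms appear in a list of post texts.
import re
from collections import Counter

FEAR_TERMS = {"move", "leave", "escape", "flee"}  # assumed illustrative word list

def fear_term_frequencies(posts: list[str]) -> Counter:
    counts = Counter()
    for text in posts:
        for token in re.findall(r"[a-z']+", text.lower()):
            if token in FEAR_TERMS:
                counts[token] += 1
    return counts

if __name__ == "__main__":
    sample = [
        "Time to leave the city before it is too late",
        "They say everyone should move west and escape now",
    ]
    print(fear_term_frequencies(sample))  # Counter({'leave': 1, 'move': 1, 'escape': 1})
```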

The real user either cannot tell the difference between a bot and a real person or, even when an account’s «bot» status is obvious, still intuitively treats its position as real. The user finds himself in a fantastical «Matrix» in which he is a hostage of its organizers. How can this be fought? The question is far from trivial. The author of this article once adored social networks, but now visits Facebook no more than once a week, does not use Twitter at all, and limits his communication and exchange of views to messengers with good friends. In the 2000s social media opened the world up with its possibilities and breadth, but what it has become is a reversal of that history. In the movie «Terminator» it was life-threatening to mistake a robot in human skin for a human being. Today it is dangerous for freedom of opinion to mistake the comments of a bot for those of a live user. For now, the best way out remains total skepticism toward such «opinions on the Internet» and trust only in the tried and true people you know offline.
