Wikipedia's Bots Wage Small Wars Every Day

Researchers at the Oxford Internet Institute, after analyzing nine years of data on Wikipedia's bots, found that even useful bots spend a great deal of time undoing the work of other bots. More specifically, there are many cases where two bots have edited the same content on Wikipedia and then repeatedly reverted each other for years. The study was published in the journal ‘PLoS ONE’ on February 23rd.

Wikipedia needs benevolent bots because it is a vast site, with hundreds of language editions and 40 million articles. Human power alone cannot identify and manage in time all the minor changes that occur across so many documents. So Wikipedia relies on bots to help manage the site. These bots add links to other Wikipedia pages, revert vandalism, detect copyright infringement, check spelling, and perform other tasks autonomously. This frees human editors to add information and write new articles instead of policing basic grammar and formatting.

However, the functions of two bots sometimes conflict. The cause may be that the two bots follow slightly different syntax rules, or that they are trying to link the same word to different Wikipedia pages. In any case, when one bot changes the contents of a page, the other bot comes along and changes it back, and they do this endlessly. Because they are bots, they never tire.

Eventually a human may notice that the same change keeps being made and undone, and step in. Until humans become aware of the situation, the bots have no way to stop doing what they do. But on a site as vast as Wikipedia, humans may never notice at all. Bots have no top-down command structure; they only follow the instructions they are given. Anyone can submit a bot, and once it is approved it is free to edit any of the 40 million articles. If it conflicts with another bot, it is up to the two bots' creators to sort it out. There is no third party to call on, no equivalent of the police in the real world.
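To make the failure mode concrete, here is a minimal toy sketch in Python. The bot names, their spelling rules, and the loop are hypothetical illustrations, not code from the study or from any real Wikipedia bot: two bots each enforce a different rule on the same text, and every pass by one undoes the previous pass by the other.

```python
# Toy sketch of a bot-on-bot revert war. Bot names and rules are hypothetical;
# real Wikipedia bots edit wiki markup through the MediaWiki API.

def bot_a(text: str) -> str:
    """Bot A's rule: prefer the British spelling."""
    return text.replace("color", "colour")

def bot_b(text: str) -> str:
    """Bot B's rule: prefer the American spelling."""
    return text.replace("colour", "color")

article = "The infobox lists the team color."
edits = 0

# Each bot periodically rescans the page. Neither knows the other exists,
# so every pass by one bot undoes the previous pass by the other.
for _ in range(10):
    for bot in (bot_a, bot_b):
        edited = bot(article)
        if edited != article:
            edits += 1
            article = edited

print(f"{edits} edits later the article still reads: {article!r}")
```

In the sketch, twenty edits accumulate across ten rounds and the text never settles. On the real site the rounds are spread over months or years, which is part of why these duels went unnoticed for so long.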

Wikipedia has adopted its own open editing model. Bot authors therefore only need to release their bots into it, and are not required to manage them properly afterward. The same thing is happening on other large internet sites.

Bots produce a quarter of all tweets. Bots account for half of all advertisement views. And they send millions of messages through various chat programs.

No internet police has the power to sanction bots. That means bots roam free of any sanction, interacting with humans and with other bots. No one monitors their activity, so there is no way to know where they are or what they are doing.

By headcount, bots make up less than 1% of Wikipedia's editors. But according to the researchers, the proportion of editing work done by bots runs as high as 10 to 50 percent.

Bots do that much of the work on Wikipedia, a website that much of the world relies on, which is reason enough to pay attention to the fights among them. It is also noteworthy that bots on the German Wikipedia revert one another far less often than bots on the Portuguese or English Wikipedia.

Some might joke that this is down to German engineering skill and efficiency, but that is not the real reason. Bots on the Portuguese Wikipedia simply edit more than bots on the German Wikipedia, and the more edits bots make, the more chances they have to fight each other. Likewise, bot-versus-bot wars outnumber human-versus-bot wars not because bots are more argumentative than humans, but because they do more editing.

In the end, the bots' folly is learned from humans. They argue endlessly on Wikipedia, message boards, and Twitter because humans did it first, all the time. So do not blame the bots.
