News Net Nebraska


Is Wikipedia an unreliable source? In the future, this may no longer be the case, thanks to artificial intelligence


An AI system can tell less reliable sources apart from trustworthy ones. What does this mean for Wikipedia now, and what could it mean in the future?

An article published in Nature Machine Intelligence describes an artificial intelligence system that is able to distinguish reliable sources from inconsistent ones. Wikipedia is considered one of the most widely consulted encyclopedic portals in the world, the result of continuous review by its users. So we asked ourselves: what would happen if this painstaking work were done by AI instead of by users?


In this article, let's find out what would happen if artificial intelligence replaced humans in selecting the reliable sources that back Wikipedia's entries.

Wikipedia and artificial intelligence

The project was developed by Samaya, a London-based artificial intelligence company. The system, called SIDE, is built from neural networks. Neural networks are, in essence, statistical models that process data automatically, and their operation is inspired by the human nervous system.
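To make the idea of a neural network as a "statistical model" concrete, here is a minimal sketch of a single forward pass through a one-hidden-layer network. This is an illustrative toy, not SIDE's actual architecture; all names and sizes here are invented for the example.

```python
import numpy as np

def relu(x):
    """Nonlinearity: keep positive values, zero out the rest."""
    return np.maximum(0.0, x)

def forward(x, w1, b1, w2, b2):
    """One hidden layer: weighted sums of the input, a nonlinearity,
    then a final weighted sum producing a single score."""
    hidden = relu(x @ w1 + b1)
    return hidden @ w2 + b2

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))          # one input with 4 features
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # learned in training
w2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
score = forward(x, w1, b1, w2, b2)
print(score.shape)  # (1, 1): a single scalar output
```

In a real system the weights are not random: they are adjusted during training so that the output score becomes useful, for example as a measure of how well a source supports a claim.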

SIDE can assess the reliability of the sources cited at the bottom of Wikipedia articles. The algorithm picks out the claims made in an entry and checks whether they are actually supported by the cited sources: in this way it judges whether a citation is accurate or questionable, and therefore whether the statements in the article can be verified.
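The core task, matching a claim against a candidate source passage and producing a support score, can be sketched in a few lines. SIDE uses trained neural models for this; the bag-of-words cosine similarity below is only a crude stand-in to illustrate the shape of the problem, and the example sentences are invented.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def support_score(claim: str, passage: str) -> float:
    """Toy verification score: word overlap between claim and source."""
    return cosine(Counter(claim.lower().split()),
                  Counter(passage.lower().split()))

claim = "the eiffel tower was completed in 1889"
good  = "construction of the eiffel tower was completed in 1889"
bad   = "the colosseum is an amphitheatre in rome"
print(support_score(claim, good) > support_score(claim, bad))  # True
```

A real verifier would rank all candidate sources by such a score and flag the claim as unverifiable when even the best candidate scores poorly.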


The system developed by the London company was trained on data selected against strict quality criteria. The algorithm can analyse and rank the most accurate Wikipedia entries: those compiled with the greatest accuracy, impartiality, and completeness are marked with a bronze star at the top right of the page.


After the training phase, the scientists asked SIDE to check articles it had never analysed before. The AI system then examined the original sources and, following its predefined criteria, suggested the sources it judged most trustworthy. Finally, the outcome was submitted to the final judgment of human reviewers.

According to the human reviewers, in more than half of the cases the result selected by SIDE matched the source already cited. For the top 10% of citations that SIDE flagged as unverifiable, the system suggested alternatives, and the reviewers considered those alternatives preferable in 70% of cases.

Users preferred SIDE's first suggestion twice as often as the existing Wikipedia citation. According to the researchers, these results are more than encouraging, because they show that systems of this kind can work side by side with humans to make resources like Wikipedia more reliable.