ASTRID Cognitive Warfare - Specific support for defence research and innovation work (Accompagnement spécifique des travaux de recherches et d'innovation défense) - Cognitive Warfare thematic call

Tracking and detecting fake news & deepfakes on Arab social networks – TRADEF

Submission summary

Fourth-generation warfare (4GW) is understood as information warfare that involves non-military populations. It is waged by national or transnational groups driven by cultural, religious, economic or political ideologies, with the aim of creating chaos in a targeted part of the world. In 1989, the authors of an article on fourth-generation warfare, some of them military officers, predicted that it would become widespread and difficult to define in the decades to come. With the advent of social media, this blurred battlefield has found a home: one of the main points of penetration of 4GW is the massive use of social networks to manipulate opinion. The objective is to condition the opinion of part of the world to accept a state of affairs and to make it humanly acceptable and politically correct. Like fourth-generation warfare, cognitive warfare aims to disrupt the mechanisms of political, economic and religious understanding, with the effect of destabilizing and weakening the adversary. This cognitive war therefore targets the brain of the presumed enemy: ultimately, the ill-defined battlefield of 4GW moves into the opponent's brain, or more precisely into the subconscious of the opponent's population. It seeks to alter reality, among other means, by flooding that population with misinformation, rumors, fabricated videos and deepfakes. In addition, the proliferation of social bots now makes it possible to generate disinformation on social networks automatically: according to some sources, 19% of the total volume of tweets around the 2016 US elections was generated by such bots. With TRADEF, we focus on two disinformation channels: fake news and deepfakes. The idea is to detect, very early, the birth of a fake in its textual, audio or video form on social networks and to follow its propagation through the networks.
The goal is thus to detect the birth of a fake and follow it over time. At any point, a potential rumor is analyzed and assigned a trust score, and it is tracked through social networks both in the reference language and in other languages. As the suspicious information evolves over time, its score evolves with the data it is confronted with: the information under test is matched against audio or video data that may invalidate or confirm its credibility. Videos used as sources to debunk a fake may themselves be deepfakes, which requires vigilance in examining them and motivates the development of robust deepfake-detection methods. Finally, the project introduces a dimension of explainability of the results. Building on the participating teams' experience in deep learning and in processing Modern Standard Arabic and its dialects, we propose to track down and identify fakes on Arabic social networks. This raises further scientific challenges: handling the code-switching phenomenon, coping with the variability of Arabic dialects, identifying named entities in the speech continuum, developing neural methods for under-resourced languages, and explaining the results obtained.
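The evolving trust score described above can be sketched as follows. This is a minimal illustration, not the project's actual method: it assumes a score in (0, 1) updated in log-odds space, where each piece of evidence the rumor is confronted with carries a signed weight (positive when it supports the claim, negative when it refutes it). The function name and the weights are hypothetical.

```python
import math

def update_score(score: float, evidence_weight: float) -> float:
    """Update a credibility score in (0, 1).

    evidence_weight > 0 supports the claim, < 0 refutes it;
    its magnitude reflects the reliability of the source.
    """
    # Move to log-odds, add the evidence weight, map back with the logistic.
    log_odds = math.log(score / (1.0 - score))
    log_odds += evidence_weight
    return 1.0 / (1.0 + math.exp(-log_odds))

# A suspicious post starts at a neutral 0.5 and is confronted with
# two refuting fact-checks and one weakly supporting repost.
score = 0.5
for weight in (-1.2, -0.8, +0.3):
    score = update_score(score, weight)
print(round(score, 3))  # → 0.154
```

Because the update is additive in log-odds, the order in which evidence arrives does not change the final score, which is convenient when matching a rumor against sources found at different times.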

Project coordination

Kamel SMAILI (Laboratoire lorrain de recherche en informatique et ses applications)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines all responsibility for its contents.

Partners

LIA Laboratoire d'Informatique d'Avignon
LORIA Laboratoire lorrain de recherche en informatique et ses applications

ANR grant: €295,715
Start and duration of the scientific project: December 2022 - 36 months
