
The first pilot focused on testing FERMI’s Spread Analyser, including its capability to capture the spread of social media posts, to distinguish bot-operated from human-operated accounts, and to assess the influence of the accounts involved. The pilot was set in the context of right-wing extremism during a hybrid operation, in which disinformation is a key driver of distrust and possibly of the radicalisation of individuals or small groups. Below is a description of the scenario and of how the Spread Analyser can help.
Background
In autumn 2023, Russia started directing large numbers of asylum seekers across the border into Finland. This was a highly organised endeavour on Russia’s part, and it continued until Finland was forced to close the border completely. It was, and still is, recognised as a Russian hybrid operation against Finland.[1] Its motivations can be seen in causing chaos, or at least in stirring public and political debate and forcing a response from the authorities. This opens possibilities for spreading disinformation targeted at Finns, but also at Russians living in Finland, who are one of the main target groups of Russia’s influence activities.[2] Overall, the operation served to harm Finland but also to support Russia’s domestic information influencing, while possibly preparing the ground for the then-forthcoming presidential elections in Russia.
When the operation started, border crossing points were first restricted, with only some remaining open. There was also a legal debate weighing the complete closure of the border against the right to asylum of the people trying to cross. As the situation did not ease, the border has remained completely closed since December 2023.[3]
From the 2015 refugee crisis we have learned that such situations can cause right-wing radicalisation. Refugee and migration flows towards Finland (of African and Middle Eastern origin) that are uncontrolled, or perceived as such, are a main trigger of crimes motivated by extreme right-wing views. For example, after 2015 multiple firebomb attacks took place on accommodation centres across Finland.[4] The perpetrators were found to have had no background of radical thinking or action prior to the 2015 crisis; their radicalisation happened mainly through online platforms such as internet forums and social media.[5]
There is therefore a realistic threat of right-wing motivated crimes related to this kind of situation. In addition, since the acquisition of Twitter, now X, by Elon Musk, a growing amount of disinformation has been found on the platform.[6] Overall, there are plenty of tweets containing disinformation or hate-filled content related to the situation laid out above, which could lead to hate crimes. The scenario thus provides an ample supply of tweets, with pictures, URLs and other elements, for testing different kinds of content within the pilot.
The use of FERMI’s Spread Analyser
The Spread Analyser analyses the spread of selected tweets on X; it can distinguish between bot-operated and human-operated accounts and assess their influence. The tweets need to be identified and fed to the tool one by one, and the analysis takes time depending on its depth. The results show how widely the tweets have spread and what impact the different accounts have had in that spread. This can be very helpful when assessing the situation. Taking all of this into account, the Spread Analyser is a useful instrument, albeit as part of a broader analysis.
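To make the kind of output described above more concrete, the following is a minimal illustrative sketch in Python of how the spread, bot likelihood and influence of accounts re-sharing a single post could be summarised. The data structures, thresholds and scores here are hypothetical assumptions for illustration only; they do not represent the actual Spread Analyser’s API, models or classification logic.

```python
# Illustrative sketch only: hypothetical data structures and heuristics,
# not the FERMI Spread Analyser's real API or classification model.
from dataclasses import dataclass


@dataclass
class Account:
    handle: str
    followers: int
    following: int
    posts_per_day: float
    account_age_days: int
    reshared_post: bool  # did this account re-share the analysed post?


def bot_likelihood(acc: Account) -> float:
    """Crude heuristic score in [0, 1]: a very high posting rate, a young
    account and a skewed follower/following ratio push the score up."""
    score = 0.0
    if acc.posts_per_day > 50:
        score += 0.4
    if acc.account_age_days < 90:
        score += 0.3
    if acc.following > 0 and acc.followers / acc.following < 0.1:
        score += 0.3
    return min(score, 1.0)


def spread_report(accounts: list[Account]) -> dict:
    """Summarise how widely the post spread and which accounts matter most."""
    sharers = [a for a in accounts if a.reshared_post]
    reach = sum(a.followers for a in sharers)  # naive upper bound on exposure
    likely_bots = [a for a in sharers if bot_likelihood(a) >= 0.5]
    top_by_followers = sorted(sharers, key=lambda a: a.followers, reverse=True)[:3]
    return {
        "resharing_accounts": len(sharers),
        "estimated_reach": reach,
        "likely_bot_share": len(likely_bots) / len(sharers) if sharers else 0.0,
        "top_human_influencers": [
            a.handle for a in top_by_followers if bot_likelihood(a) < 0.5
        ],
    }


if __name__ == "__main__":
    sample = [
        Account("@news_amp_01", 120, 4000, 180.0, 30, True),
        Account("@local_blogger", 15000, 600, 3.0, 2200, True),
        Account("@quiet_reader", 80, 150, 0.5, 900, False),
    ]
    print(spread_report(sample))
```

The real tool presumably relies on richer features and trained models rather than fixed thresholds, but the sketch shows the shape of result an analyst would work with: an estimate of reach, a share of likely bot amplification, and a short list of influential human-operated accounts.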
The scenario and the nature of the social media information landscape make for a challenging environment for police work. Assessing potentially criminal content, or the crimes themselves, takes considerable resources, especially given the many anonymous actors working through international platforms. The key for the police is to be able to assess when a given information campaign might have an impact on crime occurring, and FERMI’s Spread Analyser can help with this. In the pilot scenario, it helps that the campaign is clearly targeted at Finland and that there is already an understanding from earlier cases of how it may play out.
- Situational awareness: Once the situation (the pilot 1 scenario described above) has been analysed, certain measures can be taken. Understanding what kind of narratives and disinformation content are being spread helps law enforcement agencies identify possible perpetrators (here, identifying human-operated accounts and the most influential of them through the Spread Analyser can be crucial) and victims, and it also helps with planning communication campaigns. This is an ongoing process that provides input for the following steps.
- Communication from authorities and leaders is key. The police play their part in this; prevention campaigns can also be operational in nature, for instance by showing presence or by holding meetings with other stakeholders and groups in society.
- As multiple actors, agencies and stakeholders are involved, coordinated collaboration is highly important. In Finland, the Border Guard is a separate organisation from the police, so collaboration between the two needs to be tight.
- Focusing resources on high-risk areas and locations: once potential victims are recognised, it is easier to plan operational measures. Especially in this scenario, good communication and collaboration with other actors is key.
- If the situation escalates into crime, good situational awareness and well-prepared measures are the second-best option after prevention. Understanding the motives and causes of hate crimes helps keep other potential victims safe, catch the perpetrators and prevent others from committing similar acts. The spread analysis can in due course help identify further accounts that have been used to engage in crime.
Using the Spread Analyser as part of analysing and assessing the situation in the scenario described above can help authorities take more effective measures to minimise impact, prevent crime, or catch and investigate potential perpetrators. FERMI tools offer capabilities that mitigate the shortage of resources faced by law enforcement agencies constantly working with a growing amount of information spread on the internet, especially on social media platforms.
[1] Jyri Lavikainen, “Russia’s hybrid operation at the Finnish border: Using migrants as a tool of influence.” Finnish Institute of International Affairs. November 2023. Available at: https://fiia.fi/en/publication/russias-hybrid-operation-at-the-finnish-border.
[2] Yle News, “Disinformation campaigns target Finland's foreign language speakers, Nato fears.” Yle. 6 July 2022. Available at: https://yle.fi/a/3-12525251.
[3] Miranda Bryant and Lisa O'Carroll, “Finland closes entire border with Russia after tensions over asylum seekers.” The Guardian. 28 November 2023. Available at: https://www.theguardian.com/world/2023/nov/28/finland-closes-entire-border-with-russia-after-tensions-over-asylum-seekers.
[4] Saija Nironen, “Tuli on roihunnut useissa vastaanottokeskuksissa – Yle koosti syksyn tapaukset yhteen” [Fires have blazed at several reception centres – Yle compiled the autumn’s cases]. Yle. 10 December 2015. Available at: https://yle.fi/a/3-8517586.
[5] Eero Mäntymaa, “Polttopulloja ja katon poraamista – Poliisi: Vastaanottokeskuksiin iskevät Suomessa humalaiset rasistit” [Petrol bombs and roof drilling – Police: In Finland, reception centres are attacked by drunken racists]. Yle. 28 December 2015. Available at: https://yle.fi/a/3-8549219.
[6] Miah Hammond-Errey, “Elon Musk’s Twitter Is Becoming a Sewer of Disinformation: Changes to the platform have systematically amplified authoritarian state propaganda.” Foreign Policy. 15 July 2023. Available at: https://foreignpolicy.com/2023/07/15/elon-musk-twitter-blue-checks-verification-disinformation-propaganda-russia-china-trust-safety/.