How do Technological Advancements Shape Modern Warfare? A Focus on 5GW and Hybrid Warfare in Recent Conflicts 

Author: Abby McAloon

Contributing researchers and editors: Dr Nadia Musa, Dr Mikael Leidenhag
28/10/2025 

 

What Are 5GW and Hybrid Warfare? 

In an era centered on technological advancement, the notion of warfare is undergoing a profound transformation. Traditionally, conflict centered on military strategies aimed at weakening enemy forces; today, modern warfare relies heavily on high-tech surveillance, such as drones and advanced monitoring systems, and on cyber operations. This shift is captured by the concept of 5GW, or Fifth Generation Warfare, which focuses on non-kinetic actions rather than traditional conflict tactics.[1] Conventional warfare employed weapons such as tanks and artillery with the aim of depleting enemy resources, whereas 5GW uses tools such as artificial intelligence, psychological operations, and the shaping of perceptions.[2] Hybrid warfare, commonly used in recent conflicts, is the synchronised use of multiple tools of power, such as combining conventional and unconventional military action, cyberattacks, and propaganda. 

Unlike previous generations of warfare, 5GW is a battle of narratives and information.[3] Many theorists regard 5GW as one of the deadliest forms of warfare in history, because its secrecy and dispersion of violence distort public perception, producing a manipulated and heavily influenced worldview.[4] The Handbook of 5GW (Abbott, 2010), a collection of articles on the emergence of 5GW, suggests that it would eventually become a cultural and moral conflict fought through manipulating perceptions and altering worldviews.[5] This highlights that 5GW is measured not by the tools of war but by tactics that go beyond physical force. In this form of war, the battlefield exists everywhere – from our phones to our values. 

 

How Disinformation is Used as a Tool

Shaping public opinion and manipulating narratives play a major role in 5GW. Contemporary society consists of generations raised with technology as a central part of their lives. From socialising to schoolwork, we all rely on modern technology. So, how can something that makes our lives so much simpler be used as a tool of war? 

Conflict has a long history of misinformation, with propaganda serving as a crucial tool during World War II. Today's tools are subtler, with media campaigns and social media serving as instruments. Technology has transformed the rules of war through the development of information technology, artificial intelligence, and advanced communications.[6] Social media platforms such as Facebook, Instagram, and X (formerly Twitter) have become fertile ground for fake news and misleading reports. This is known as media warfare: the use of information and communication technologies to achieve a strategic advantage over an opponent. 

5GW and hybrid tactics operate in the digital space and blur the lines between combatants and civilians, conflict and peace, and reality and perception. Conventional military forces, external actors, proxy militias, and insurgent groups all partake in a mix of traditional and non-kinetic operations. Hybrid warfare is not a new concept, but it gained prominence after Russia's annexation of Crimea in 2014 and its full-scale invasion of Ukraine in 2022. Russia has long used technological tools such as media channels to frame opinions, alongside government-controlled internet 'trolling'.[7] Russian TV channels have helped shape narratives and public opinion, with the Russian Presidential Administration exercising coordinated control over advertising budgets and editorial content while permitting a small number of minor independent outlets to operate, creating an illusion of media freedom.[8] For example, RT, formerly 'Russia Today', is a state-controlled news network used for information laundering. RT is blocked throughout Europe, but it can still be accessed when third-party websites repost its content, bypassing restrictions and extending its reach.[9]

In 2022, Russia launched a disinformation campaign known as 'Doppelgänger' to spread pro-Russian narratives targeting Western and NATO allies, using techniques such as inauthentic social media accounts, fake news portals, and forged copies of mainstream media websites.[10] The effectiveness of these mechanisms can be attributed to various factors, one being the legal loopholes that surround AI legislation globally. The European Union approved the AI Act (2024) to provide a framework for the use of AI.[11] Implementation is staggered: the Act entered into force in August 2024, with its obligations phased in gradually from 2025 onwards. The Act's definitions of AI systems and AI models remain vague, leaving confusion over who must abide by the legislation and room for hostile actors to exploit loopholes and use AI to spread disinformation. The Doppelgänger campaign operates on X, employing bots that imitate real users to push pro-Russian content and portray Ukraine negatively.[12] This has been described as building digital Potemkin villages. The spread of disinformation on global platforms fuels fearmongering and acts as a weapon to destabilise and undermine. It is deployed both before and after open hostilities, with Russian disinformation campaigns linked to the Brexit vote (2016) and the 2019 European Parliament elections.[13] Therefore, 5GW and hybrid techniques allow states to combine regular and irregular tactics to maximise instability.

In recent years, 5GW has featured in conflicts in the Middle East, particularly in Syria. Yet the rise of technology can also help civilians and the targets of conflict: it can counter the spread of misinformation and provide evidence from those on the ground. In 2017, the town of Khan Shaykhun in Syria was hit by a chemical attack that killed at least 80 people.[14] The Assad regime and its ally Russia blamed rebels and the United States, claiming the attack was a fabrication. However, mobile phones and social media allowed the attack to be documented and witnessed globally. After detailed reports from Doctors Without Borders, local medics, and the Organisation for the Prohibition of Chemical Weapons, it was confirmed that sarin had been used, delivered by a Syrian jet acting on government orders.[15] This shows that, on the one hand, misinformation can be a tool used by states to promote their agenda; on the other, technology can enable civilians to share their perspectives, counteracting the misuse of social media platforms. 

 

Why Tackling Disinformation in 5GW and Hybrid Warfare is Important

The United Nations Global Risk Report (2024)[16] ranks mis- and disinformation as a top global threat, and one that countries are least prepared to face, with a 'lack of trust' identified as a barrier to combatting it. As noted above, 5GW and hybrid warfare are deliberate tactics used to destabilise opponents by shaping public opinion and spreading misinformation to promote certain narratives. Many civilians do not realise they are part of ongoing conflicts, because the violence is discreet and the battlefield is vast. These tactics blur the line between civilians and combatants, typically increasing the role that non-military personnel play in war.[17] In conflict zones, time is of the essence; social media and access to technology can be used for good, such as directing civilians to safe points and providing reliable information.

However, technology can also be used to tarnish the reputation of humanitarian workers and to promote disinformation about groups in society, resulting in increased hate crimes and violence during conflict.[18] False narratives can thus be weaponised to entrench and deepen existing divisions between civilian populations, facilitate recruitment for armed actors, and ultimately justify military action. Tackling the spread of disinformation is vital to reducing the spread of fear. It is also important to note that digital access and technology are not inherently harmful; they are tools used to promote specific narratives about conflicts, governments, and groups in society. 

 

Recommendations

The persistence of disinformation campaigns indicates that a whole-of-society approach is required, involving collaboration between media platforms, civil society, and governments. It is important to uncover and block the tactics of disinformation campaigns before their manipulative methods can generate significant reach across media channels, for example through thousands of automated bot accounts. The key policy recommendations are:

  • A collaborative approach between technology firms, the media, governments, and civil society is needed to formulate regulatory policies, improve the ability of artificial intelligence to identify extremist content, and increase media literacy. Many social media users share content without fact-checking, extending the reach of disinformation campaigns. Increased international cooperation is vital to combat this transnational threat effectively and to formulate a response that secures national safety without infringing on individual liberties. Such a relationship also promotes transparency between platforms and users, enabling the spread of misinformation to be monitored and curbed. Responsibility for not abusing AI tools should be shared among all stakeholders: companies that deploy generative AI tools should label synthetic content to help curb the spread of fake news; governments should establish clear guidelines for technology firms and work with them to ensure transparency, so that hostile actors cannot exploit loopholes; and civil society, including the media, should promote media literacy and equip citizens with the skills to identify disinformation.
     
  • Promoting cultural initiatives and youth programmes is vital in fostering societal resilience against extremism. Correcting disinformation after it has spread can be effective, but it is even more important to build public resilience against misinformation in advance. Studies have shown that psychological inoculation can reduce susceptibility to misinformation.[19] Programmes have been introduced globally to tackle this issue: for example, the PSIA Tech and Global Affairs Innovation Hub, in collaboration with the Paris Peace Forum, has hosted events exploring how young people can help counter the spread of disinformation. Young citizens were invited to observe discussions on the rise of AI and its use as a weapon, and to share their own ideas on the topic.[20] Programmes of this kind raise awareness and help individuals build the skills needed to identify the techniques commonly used in misinformation campaigns.
     
  • Evaluating how human oversight will be incorporated into the use of AI, such as in warfare or training, or where decisions about national security draw on AI-derived intelligence. This includes human-in-the-loop (HITL), where a human approves each decision; human-on-the-loop (HOTL), where a human monitors the system and can intervene; and human-in-command (HIC), where a human retains overall authority, applied at varying stages of AI projects. Such oversight minimises health and safety risks and keeps the use of AI in accordance with the law. It also allows ethical concerns to be considered and helps prevent the intentional misuse of AI systems, including manipulation tactics. 
     

References 

[1] Chang, Y., Keblis, M. F., Li, R., Iakovou, E., & White, C. C. (2022). "Misinformation and Disinformation in Modern Warfare." Operations Research, 70(3), pp. 1577–1597.

[2] Krishnan, A. (2024). Fifth Generation Warfare: Dominating the Human Domain (1st ed.). Routledge. https://doi.org/10.4324/9781003396963

[3] Qureshi, W. A. (2019). Fourth- and Fifth-Generation Warfare: Technology and Perceptions. Available: https://digital.sandiego.edu/cgi/viewcontent.cgi?article=1293&context=ilj [Accessed 2nd September 2025] 

[4] Abbott, D. H. (ed.) (2010). The Handbook of 5GW: A Fifth Generation of Warfare? Ann Arbor, MI: Nimble Books LLC.

[5] Abbott, D. H. (ed.) (2010). The Handbook of 5GW: A Fifth Generation of Warfare? Ann Arbor, MI: Nimble Books LLC.

[6] Azad, T. M. (2020). “Understanding the International Propaganda Patterns Against Pakistan.” In Fake News: Unravelling the greater complexity of how individuals, institutions and whole nations manipulate facts to create credible fake news to their advantage. Institute of Regional Studies.

[7] Obreja, D. M. (2024). The “Russian bots” between social and technological: Examining the ordinary folk theories of Twitter users. New Media & Society, 0(0). https://doi.org/10.1177/14614448241255692

[8] Federal Foreign Office. (2024). Technical Report on an Analysis by the Federal Foreign Office, 5 June 2024: Germany Targeted by the Pro-Russian Disinformation Campaign "Doppelgänger". Available at: technischer-bericht-desinformationskampagne-doppelgaenger-1--data.pdf [Accessed 30th August 2025] 

[9] Schafer, B., Benzoni, P., Koronska, K., Rogers, R., and Reyes, K. (2024). The Russian Propaganda Nesting Doll: How RT is Layered Into the Digital Information Environment. The German Marshall Fund. Available at: https://www.gmfus.org/news/russian-propaganda-nesting-doll-how-rt-layered-digital-information-environment [Accessed 10th October 2025]

[10] Pamment, J and Tsurtsumia, D. (2025). Beyond Operation Doppelgänger: A capability assessment of the social design agency. Psychological Defence Research Institute. Available at: https://www.psychologicaldefence.lu.se/article/beyond-operation-doppelganger-capability-assessment-social-design-agency [Accessed: 03 September 2025]

[11] Official Journal of the European Union. (2024). Laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act). European Union. Available at: http://data.europa.eu/eli/reg/2024/1689/oj

[12] Smart, B., Watt, J., Benedetti, S., Mitchell, L., & Roughan, M. (2022). ‘#IStandWithPutin Versus #IStandWithUkraine: The Interaction of Bots and Humans in Discussion of the Russia/Ukraine War.’  In: Hopfgartner, F., Jaidka, K., Mayr, P., Jose, J., Breitsohl, J. (eds.) Social Informatics. SocInfo. Springer, Cham. Available at:  https://doi.org/10.1007/978-3-031-19097-1_3 

[13] Legucka, A. (2020, March 19). Russia's Long-Term Campaign of Disinformation in Europe. Carnegie Europe. 

[14] Brooks, J., Erickson, T.B., Kayden, S., Ruiz, R., Wilkinson, S., and  Burkle Jr, F.M (2018). “Responding to chemical weapons violations in Syria: legal, health, and humanitarian recommendations”. Confl Health 12, 12. https://doi.org/10.1186/s13031-018-0143-3 

[15] Ahmad, M.I. (2024). “Open Source Journalism, Misinformation and the War for Truth in Syria.” In Open Source Investigations in the Age of Google. Security Science and Technology, pp. 133–151. 

[16] United Nations Global Risk Report. (2024). New York: United Nations. Available at:  UNHQ-GlobalRiskReport-WEB-FIN.pdf [Accessed 2nd September 2025]

[17] Meier, M. (2019). "Emerging Technologies and the Principle of Distinction: A Further Blurring of the Lines Between Combatants and Civilians?" In The Impact of Emerging Technologies on the Law of Armed Conflict, pp. 211–234. DOI: 10.1093/oso/9780190915322.003.0008

[18] Ulbricht, B., and Rizk, J. (2024). "How harmful information on social media impacts people affected by armed conflict: A typology of harms." International Review of the Red Cross, 106(926), pp. 823–862. 

[19] Pilditch, T. D., Roozenbeek, J., Madsen, J. K., & van der Linden, S. (2022). “Psychological inoculation can reduce susceptibility to misinformation in large rational agent networks”. Royal Society Open Science, 9(8), 211953. https://doi.org/10.1098/rsos.211953

[20] What are your ideas for protecting democracies from disinformation (fake news, AI, attempts to influence...)? Make.org. Available at: Stop fakes EN - Make.org x SciencesPo x OTAN x Microsoft - Final Report - 02072024_EN.pptx [Accessed 2nd October 2025]  

 

©Copyright Information Integrity Partnership 2026. All rights reserved.
