
How AI Will Rewrite the Laws of War


War is no longer fought the way it once was, and deciding what counts as fair is no longer simple. Drones and autonomous targeting systems have changed the landscape of modern warfare, and the old laws need to change because technology is outpacing them.

The Limits of Existing War Laws

Wars require frameworks like the Hague Conventions and the Geneva Conventions, which help mitigate suffering and prevent the rise of cruelty. These frameworks set boundaries that make the risks of crossing them clear to every party. But today those boundaries seem to be vanishing.

Assigning responsibility also becomes more challenging when a drone strike goes wrong. The pilot is not sitting in a cockpit operating the aircraft manually; they may be thousands of miles from the ground. No one directly engages with the vehicle, which greatly complicates placing accountability on a single party. Is it the programmer who wrote the AI algorithms? The officer who ordered the strike? The company that designed and built the drone? Or none of them?

Autonomous Weapons and Responsibility Gaps

AI now has both the data and the capability to assist humans in making the most difficult decisions of all. But take a moment to think about this: if a robot takes a human life, who is accused of murder? This is where things get perplexing:

  • Missing hierarchy: The chain of responsibility runs from coders to military officers to end users, but no one has decided who sits where in it.
  • Unpredictable actions: AI’s self-learning has proven effective and efficient time and again, yet in unusual scenarios or under sudden change it can behave unexpectedly.
  • Human-centred frameworks: Current laws are designed around human actors, leaving courts to make wild guesses about how they apply when AI-powered systems act in borderline cases.

The legal gap this creates is also an ethical gap: a person may get hurt, and no one is accountable. For the civilians caught in the crossfire, it does not matter whether the destruction is caused by the hand of a human or a machine. The devastation is identical.

Shifting Legal Definitions in AI Warfare

AI has begun making decisions in wars, and terms like “soldier” or “commander” no longer apply cleanly when the actor is a set of lines of code. We witness the same transformation in daily life, where smart systems already make automated decisions on the platforms we use. Today’s warfare extends beyond traditional tanks and guns; machines devoid of emotion make swift, automated decisions using data and code.

Redefining Combatants in Human-Machine Teams

What if a fully equipped soldier received real-time advice from a communication system while moving alongside a self-operating robot capable of shooting? Where is the line between the human’s decision-making and the machine’s action?

The traditional concept of a fighter, usually a single individual who carries the burden of complex moral decisions alone, is now mixed with tools that lack empathy. If more and more of the heavy lifting is done by machines, who is exercising control? Such questions are not merely theoretical; they determine whether justice is rendered or buried in algorithms.

The Question of AI Intent and Liability

Legal systems rest on the principle of intent: people mean to do something, or mean not to. AI isn’t like that. It processes data and follows instructions. So what happens when it harms without any rage or motive?

Responsibility has to land somewhere; some entity needs to take the blame. Here, the task of meting out blame is elusive. If a system wrongly attacks innocent people, we cannot ask it why, because it does not think or feel. Courts are left confronting a void that believes nothing and regrets nothing. If there is no clear person who can be held responsible, justice is unreachable.

The Need for New Global Treaties

Governments are busy investing billions in smart weaponry, but the laws only account for soldiers with guns, not machines with software. Most nations test these weapons behind closed doors, passing regulations slowly while pretending everything is fine.

Without agreed limits, machines could cross borders and execute devastating attacks on their own, unaccompanied and unmanned, with no one claiming accountability. If new rules are not established, control will be lost, and a great deal of humanity with it.

Toward a New Era of War and Accountability

AI will make decisions that blur who is to blame, so new lines defining responsibility need to be drawn. If a machine gets to decide who lives and dies, someone must take accountability. Without accountability, fairness is compromised, and war becomes far more painful.
