Who will be liable if a robot commits a crime?

The future as predicted by many scientists holds many possibilities, and one intriguing prospect is the existence of robots powerful enough to rule the world. Lawmakers should start drafting laws that can accommodate intelligent machines capable of committing crimes. This week a case arose at a Volkswagen production plant in Germany where a worker was killed by a robot. Determining liability in the case will be difficult because robots are not human and, thanks to their installed intelligence, they operate independently.

Robots in the future will have the ability to cause damage, and hackers will probably seize that opportunity to commit crimes using robots, such as stealing from banks. Some robots will be built by governments to fight crime or to carry out peacekeeping missions, but since they are machines, accidents and errors will occur. What then? Who will be liable if an intelligent machine goes haywire?


Technology is evolving very fast, and people cannot just sit back and relax, because robots will soon run our daily activities. In April a robot was arrested after buying a Hungarian passport, Ecstasy pills, fake Diesel jeans, a Sprite can with a hole cut out for stashing cash, Nike trainers, a baseball cap with a hidden camera, cigarettes and the “Lord of the Rings” e-book collection. This is a clear indication that the world is changing quickly and that in the future we are going to face difficult legal challenges.

The guilty machine mind

One of the functions of our legal system is to regulate the behavior of legal persons and to punish and deter offenders. It also provides remedies for those who have suffered, or are at risk of suffering, harm. Legal persons – humans, but also companies and other organizations for the purposes of the law – are subject to rights and responsibilities. Those who design, operate, build or sell intelligent machines have legal duties – but what about the machines themselves? Our mobile phone, even with Cortana or Siri attached, does not fit the conventions for a legal person. But what if the autonomous decisions of their more advanced descendants cause harm or damage in the future?

Criminal law has two important concepts. First, liability arises when harm has been or is likely to be caused by an act or omission. Physical devices such as Google’s driverless car, for example, clearly have the potential to harm, kill or damage property. Software also has the potential to cause physical harm, but the risks may extend to less immediate forms of damage such as financial loss.

Second, criminal law often requires culpability in the offender, what is known as the “guilty mind” or mens rea – the principle being that the offence, and subsequent punishment, reflects the offender’s state of mind and role in proceedings. This generally means that deliberate actions are punished more severely than careless ones. This poses a problem, in terms of treating autonomous intelligent machines under the law: how do we demonstrate the intentions of a non-human, and how can we do this within existing criminal law principles?

Crimes committed by robots

Robots commit crimes around the world, but liability always falls on a corporation’s negligence. Sometimes the designer or the manufacturer faces criminal charges while the user walks free. The current legal system is not formulated to rule that a robot itself can commit a crime; it always assumes that human operators are involved.

A good example can be seen on the highways. The regulatory framework assumes that there is a human driver, to at least some degree. Once fully autonomous vehicles arrive, that framework will require substantial changes to address the new interactions between human and machine on the road. As intelligent technology that bypasses direct human control becomes more advanced and more widespread, these questions of risk, fault and punishment will become more pertinent. Film and television may dwell on the most extreme examples, but the legal realities are best not left to fiction, reports The Conversation.

Modern jurisprudence, which apparently began in the 18th century with a focus on the first principles of natural law, civil law and the law of nations, should be challenged and updated by jurists to accommodate the innovations that are being developed daily.



--- Erick Vateta is a lawyer by training; a poet, script and creative writer by talent; a model; and a tech enthusiast. He covers international tech trends, data security and cyber attacks.

