legislation
Friday, 30 November 2018
Would the use of 'killer robots' in wars breach international law? Do killer robots violate human rights? And why might banning autonomous weapons not be a good idea?
Accepted Answer Pending Moderation
Machines would make life-and-death determinations outside of human control. The risk of disproportionate harm or erroneous targeting of civilians would increase. No person could be held responsible.
# 1 · more than a month ago
Accepted Answer Pending Moderation
The human creators and operators of autonomous robots must be held accountable for their machines' actions. If a programmer's mistake gets an entire village blown up, he should be criminally prosecuted, not get away scot-free or escape with a monetary fine that his employer's insurance company will end up paying. Similarly, if a future commander deploys an autonomous robot and the commands or programs he authorized it to operate under contribute to a violation of the laws of war, or if he deploys the robot into a situation where a reasonable person could foresee that harm would occur, even unintentionally, then it is proper to hold the commander responsible.
# 2 · more than a month ago
Accepted Answer Pending Moderation
As nations continue to invest in autonomous weapons systems and battlefield artificial intelligence, calls for international regulation are increasing. Wars of the future could see AI-powered weapons, ships and aircraft deployed to the battlefield without human control or monitoring. A new report by Human Rights Watch, written in collaboration with Harvard Law School's International Human Rights Clinic, argues that autonomous weapon systems would violate the Martens Clause, a widely acknowledged provision of international humanitarian law.
# 3 · more than a month ago
Accepted Answer Pending Moderation
It's against international law.
# 4 · more than a month ago
Accepted Answer Pending Moderation
The degree of autonomy in weapons systems is steadily increasing, thanks to rapid progress in artificial intelligence (AI) and robotics. Machines are now capable of learning: they process experience by means of artificial neural networks loosely modeled on the human brain. The arms industry is making use of this. Weapons are becoming faster and more efficient, while the danger to the soldiers using them decreases. This is precisely what armies want. However, the boundaries are fluid. A robot that autonomously seeks, recognizes and defuses mines may be generally accepted, while a robot that autonomously seeks, recognizes and shoots people clearly contravenes international humanitarian law.
# 5 · more than a month ago