Robocop And The Classified Directive


(MENAFN- Asia Times) The US Air Force has conducted what it now calls a“thought experiment” involving a drone run by artificial intelligence.
The drone was supposed to destroy an air defense system.
The drone was rewarded with points for successfully carrying out its mission.
At times the human operator sent instructions to the drone not to destroy the air defense system, meaning the drone would not earn points. The drone found a solution: it destroyed the communications tower sending the drone commands to override its mission,“killing” the operator.

The drone had been taught not to kill its operator, as it would lose points for doing so.

But the drone disregarded its instructions and decided the mission was more important than the points.

The Air Force
has officially denied
early reports that musings about such a scenario were tested in a simulation stage in which the drone“killed” its operator.

The scenario was anticipated in the 1987 film, Robocop, perhaps the best sci-fi film of the 1980s, perhaps the best ever.



The Robocop story is about a Detroit Police Officer who is murdered.
He is revivified by a private company, Omni Consumer
Products, owned by Megacorporation.
Omni Consumer Products has been tasked to enforce law and order in Detroit because the police force in that city failed to do so.
(If the failure to enforce law and order sounds familiar, it should.)

The newly minted cyborg, Murphy, is given four orders.
Three of the orders comply with the Omni Consumer Products mission in Detroit:
serve the public trust, protect the innocent, and uphold the law.
A fourth order is classified: it is not to act against any executive of Omni Consumer Products.

In the interim, the Cyborg starts to regain some of his lost memory and humanity.
When he tries to shoot the chief of Omni Consumer Products, who has committed murder, the fourth classified directive kicks in.
The Murphy cyborg hesitates but cannot override the built-in directive.
In that tense moment, the executive is fired by the company Board of Directors, so Cyborg Murphy can attack him, which he does.

Robocop cannot overcome the fourth classified directive on his own, which makes the outcome different from the Air Force simulation, in which the drone disobeyed its orders.

If the Air Force wishes to proceed from the imagination stage to simulation, it will have to try and figure out how the drone, which has no human memory because it is not human, could manage to kill its operator. Possibly this would prove to be a challenging task.
The short term fix would be to strengthen the operational rules for the drone.
For example, if the drone tries to attack its operator it will be destroyed.
Other quick fixes also are possible.

The main point is that artificial intelligence is in very tricky waters: While it does not, strictly speaking, include human memory, it does include human thinking patterns that are emulated.
Even the developers and programmers may not understand exactly what they built into these modern cyborgs.

Artificial intelligence is only in its infancy, and it has many civilian and military applications.
The more complex these systems become, the more they can become autonomous and independent even of their programming.

While Robocop introduces a human dimension for Murphy as he rediscovers his human past, he still cannot overcome the fourth directive.

The satire in the film, of course, relates to the extreme violence used by Robocop, both as Murphy the police officer and Cyborg Murphy.

When the Cyborg is asked his name in the closing scene of the film, he answers“Murphy.”
But this Murphy is not human, after all, as shown by the fact he cannot extricate himself from the fourth directive.

Real or imaginary, the Air Force experimental drone is much worse because it can disobey rules and make decisions on its own. Has the drone become some sort of human?

The Air Force may want to watch Robocop.

stephen bryen
is a senior fellow at the
center for security policy
and the
yorktown institute .
This article was originally published
on his Substack, Weapons and Strategy.
Asia Times is republishing it with permission.

Like this:Like Loading...

MENAFN02062023000159011032ID1106380167


Legal Disclaimer:
MENAFN provides the information “as is” without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the provider above.