Karl's PC Help Forums

Subject: AI Security Devices, And AI Battle Devices Topics
JackInCT
Posted on 19-7-2017 at 13:10

I've decided to break this out into a combined topic. It is, in part, an addendum to yesterday's topic about the security robot in Washington that wound up in a water pool. I'm attempting to combine two separate aspects of AI devices: those used for local security-type patrol work, and those used for military purposes.

In one of the replies to the topic of the wet security robot, LSemmens posed the question: "...it was supposed to be a security robot. How's that work? Does it zap you with a bolt of lightning if you do something wrong?" Let me reply to his question:

Part 1 of this topic is this question: as everyone on this board knows, gun ownership in the USA is legal per the 2nd Amendment of the US Constitution. Within the framework of current laws, should an AI device, regardless of whether it is coded for security duties, be allowed to carry a legal handgun (note: there are legal restrictions on what kinds of guns can be legally owned)? I.e., should it be allowed to employ deadly force?

Part 2: There was a second article on CNN yesterday that dealt with "robots", i.e., "autonomous weapons systems", automatically engaging an "enemy" without human authorization for a kill strike.

USA General Warns Of Out-Of-Control Killer Robots

URL for this article:
http://www.cnn.com/2017/07/18/politics/paul-selva-gary-peters-autonomous-weapons-killer-robots/index.html

Me here: I've selectively edited this article:

America's second-highest ranking military officer, Gen. Paul Selva, advocated Tuesday for "keeping the ethical rules of war in place lest we unleash on humanity a set of robots that we don't know how to control."

Selva was responding to a question from Sen. Gary Peters, a Michigan Democrat, about his views on a Department of Defense directive that requires a human operator to be kept in the decision-making process when it comes to the taking of human life by autonomous weapons systems.

Peters said the restriction was "due to expire later this year."

..."ban on offensive autonomous weapons beyond meaningful human control."

But Peters warned that America's adversaries may be less hesitant to adopt such lethal technology.

"Our adversaries often do not to consider the same moral and ethical issues that we consider each and every day," the senator told Selva.

Me here: 'considering moral and ethical issues' is one thing; living them is quite another matter entirely. As for "America's adversaries may be less hesitant to adopt such lethal technology": IMO, USA politico types excel at 'holier than thou', especially when it comes to waging war on folks with a different skin color than white. Many, many USA citizens are completely indifferent (to put it mildly) to whom their military forces are killing. And there's the related question of whether all this warfare actually breeds new generations of adversaries who see USA forces as simply engaging in ethnic/religious cleansing.
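
Me here one more time: mechanically, the directive Selva was asked about boils down to a gate between the machine's own target classification and the actual strike. Here's a rough Python sketch of that idea; every name in it is hypothetical illustration on my part, not anything from a real weapons system:

Code:
from dataclasses import dataclass
from typing import Optional

@dataclass
class Track:
    # A target the system has detected and classified on its own.
    target_id: str
    classified_hostile: bool

@dataclass
class Authorization:
    # An explicit human sign-off, tied to one specific target.
    operator_id: str
    target_id: str

def may_engage(track: Track, auth: Optional[Authorization]) -> bool:
    # Autonomy ends here: without a human token there is no strike,
    # however hostile the automatic classification looks.
    if auth is None:
        return False
    return track.classified_hostile and auth.target_id == track.target_id

So may_engage(Track("T1", True), None) is False no matter what, and letting the restriction expire, as Peters warns, is in effect deleting the "if auth is None" check.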
LSemmens
Posted on 20-7-2017 at 03:12

That laser gun would have to be only part of the picture. I wonder how it would cope with cloud, or other light-dispersing media. What if an aircraft had a mirror finish on it?

Autonomous weapons are all well and good, and I'd be happy to see wars fought only machine against machine; however, that is never going to happen, human nature being what it is. Even today's guided missiles and targeting systems, whilst very accurate, still cause indiscriminate collateral damage. Yes, they can shoot a pimple off a fly's bum, but what happens when said fly then falls out of the sky? Or when, as the cowards who fight for some radical sects like to do, they hide behind innocent civilians?

War is a dirty business, but we are far more selective in our targets these days, so collateral damage is minimised compared to, say, World War II. If we could return to the days of horseback and bows and arrows, we might be able to limit civilian casualties, but that ain't going to happen.
marymary100
Posted on 20-7-2017 at 11:08

I'm with Asimov:


1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
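
The ordering is the whole trick: each law only binds where the laws above it are silent. A toy Python sketch of that priority ladder (the predicates are hypothetical stand-ins; deciding what actually counts as "harm" is the hard part the laws wave away):

Code:
def permitted(action, harms_human, prevents_harm_to_human,
              human_ordered, endangers_self):
    # First Law, first clause: acting to harm a human is always out.
    if harms_human(action):
        return False
    # First Law, "through inaction" clause: an action that prevents
    # harm to a human is required, whatever the lower laws say.
    if prevents_harm_to_human(action):
        return True
    # Second Law: obey human orders, now that the First Law is silent.
    if human_ordered(action):
        return True
    # Third Law: self-preservation comes last of all.
    if endangers_self(action):
        return False
    return True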