Yeah, I'm gonna have to agree with that guy Noel Sharkey who was mentioned in the article. In order for a machine to make moral decisions, it would have to be capable of independent and conscious thought. Machines simply don't have that. Artificial intelligence is exactly that: artificial. It's an imitation of intelligence, and nothing more. While a machine can certainly be programmed to follow a complex flowchart of questions and answers, its decision-making ability will ultimately be limited to whatever the designers and programmers who build it can come up with.
I'm all for technological advancement, but let's not give machines the authority to make life and death decisions autonomously, especially in combat.
I believe this was actually a major plot point for that movie "I, Robot" starring Will Smith that came out a few years ago.
Plus, who can forget that scene from the original "Robocop"?
https://www.youtube.com/watch?v=Hzlt7IbT...