Do you support autonomously lethal robot warfighters?

Discussion in 'Security & Defenses' started by modernpaladin, Feb 24, 2019.


Do you support the use of autonomously lethal robot warfighters?

  1. No. I believe there should be a human conscience deciding whether to 'pull the trigger.'

    3 vote(s)
    75.0%
  2. Yes. I believe a machine can be programmed to 'make the right decision' regarding killing a human.

    1 vote(s)
    25.0%
  1. modernpaladin

    modernpaladin Well-Known Member

    Joined:
    Apr 23, 2017
    Messages:
    9,930
    Likes Received:
    5,711
    Trophy Points:
    113
    Gender:
    Male
    The precedent has been set that subjecting only men to a military draft is not conducive to equality.

IMO this is a precursor to drone troops.

One of two outcomes is inevitable:
    a) the draft will be expanded to include women
    b) the draft will be abandoned

In the case of A, the next big war will see many women casualties, which I predict will undermine domestic morale to the point where we cannot sustain a war on foreign soil. The noble cause of gender equality will never overcome the genetically preset human need to protect women from violence. The MIC will use this as an excuse to greenlight autonomously lethal robots to compensate.

    In the case of B, a lack of available 'boots on the ground' will be the excuse to greenlight autonomously lethal robots to compensate.

    Are autonomous battle drones a 'good thing'?

    Historically, we as a society have maintained that there needs to be a human 'conscience' behind 'the trigger,' that programming machines to kill, even in war, runs counter to our human values.

    Is that still the case?
     
  2. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    9,568
    Likes Received:
    762
    Trophy Points:
    113
    I simply believe that nothing but a human should have the power to put another human to death.

And it even goes beyond that. You could write a perfect logic algorithm to judge criminals, removing any kind of bias, prejudice or emotion from a criminal trial on the premise that it would make justice truly "blind". And I would be 100% against that as well.

    Trying to pass along decisions of life and death into any kind of artificial intelligence is the most inhumane thing possible.

We have too many examples, both in real life and in science fiction, of AIs going out of control, whether deliberately or by accident.
     
  3. JakeStarkey

    JakeStarkey Well-Known Member

    Joined:
    Sep 4, 2016
    Messages:
    23,427
    Likes Received:
    8,520
    Trophy Points:
    113
    We have no business sending our troops to foreign soil.

    We can send our fighterbots of all sorts.
     
  4. kazenatsu

    kazenatsu Well-Known Member Donor

    Joined:
    May 15, 2017
    Messages:
    12,163
    Likes Received:
    3,438
    Trophy Points:
    113
    Yes, but I have very mixed feelings about it, in many cases.
     
    JakeStarkey likes this.
  5. kazenatsu

    kazenatsu Well-Known Member Donor

    Joined:
    May 15, 2017
    Messages:
    12,163
    Likes Received:
    3,438
    Trophy Points:
    113
    We would all hope that robots could drastically reduce soldier casualties on the battlefield, but that may just be naivety and lack of foresight on our part.

Consider that the inventor of the machine gun thought it would lead to fewer soldiers being killed, because fewer soldiers would be needed on the battlefield to shoot guns. What he didn't foresee was that it would instead just end up making war more deadly.
     
    Last edited: Mar 3, 2019
    JakeStarkey likes this.
  6. Mushroom

    Mushroom Well-Known Member

    Joined:
    Jul 13, 2009
    Messages:
    9,568
    Likes Received:
    762
    Trophy Points:
    113
    That is pretty much the original thought of every inventor of a "new" weapon. That either it would be so horrible nobody would want to face it, or that it would be so decisive that it would quickly end a conflict.

    But pretty much every time, what actually happened is that the other side copied or countered it with something else, and things continued as before.

Of his own invention, Richard Gatling said:

And when Hiram Maxim's new gun was first sent to Sudan, it was trumpeted that "merely exhibiting the gun would prove to be a great peace-preserver".

Instead, what enemies tended to do was field even larger armies and hide behind cover until they could rush the gun. Which caused the other side to amass even larger forces so they could defend the gun. That continued until WWI, when both sides dug in behind hundreds of miles of wire and trenches and charged back and forth in endless waves in a stalemate that lasted for years.
     
  7. scarlet witch

    scarlet witch Well-Known Member Past Donor

    Joined:
    Feb 26, 2016
    Messages:
    8,349
    Likes Received:
    4,823
    Trophy Points:
    113
    Gender:
    Female
  8. HonestJoe

    HonestJoe Well-Known Member Past Donor

    Joined:
    Oct 28, 2010
    Messages:
    11,430
    Likes Received:
    1,867
    Trophy Points:
    113
I think there are flaws in this logic. First, I don’t think the US military will ever need (or want) to actually draft anyone. The draft was from the days when conflicts were generally won by simple force of infantry numbers. The nature of modern warfare has changed massively since then, meaning the military needs a smaller number of highly skilled individuals rather than a mass of grunts. That doesn’t mean they don’t struggle with numbers when they have a lot of long-running conflicts (as has been the case already), but a standard draft doesn’t offer much of a solution to that problem.

On a similar basis, fully autonomous drones aren’t going to be replacing infantry in the foreseeable future – that’s far too complicated and diverse a role. As we’ve already seen with semi-autonomous drones, they work better replacing manned aircraft and as complementary tools for infantry. You’re never going to replace “boots on the ground” with robots, especially given the nature of today’s conflicts.

    I don’t see that principle changing in the foreseeable future, not only for moral reasons but for simple practical ones.
     

Share This Page