All Discussions > Steam Forums > Off Topic > Topic Details
Arvaos Jul 25, 2024 @ 2:15pm
Weaponized AI
Imagine an AI model calculating every precise battle movement, reaction, and tactic in real time.

In the entire history of our species, humans have NEVER failed to at least try to weaponise every new technology we've come up with.

How do you feel about weaponized AI?
Showing 1-15 of 34 comments
Soren Jul 25, 2024 @ 2:24pm 
Companies that tried incorporating it into their decision-making workload stopped immediately after they lost lawsuits and had to pay out money.

Airline companies tried to use AI for customer service support. The problem, of course, is that AI has no intelligence and just tends to make things up. It told customers they could have a super great deal if they booked a different flight and cancelled their current one. The airline didn't want to give what customer support had promised, got taken to court, and the court ruled that the AI customer support had cited prices and offers, and those must be honoured. Shortly thereafter the company dropped AI customer support and had humans handle the job again.

So the US military will never let it make executive decisions. It's too incompetent for that. It will only ever assist with mundane things, like target recognition, guiding missiles, some drone operation, or basic forecasting.
Last edited by Soren; Jul 25, 2024 @ 2:25pm
It never crossed my mind at all.
Crystal Sharrd Jul 25, 2024 @ 2:28pm 
Originally posted by Arvaos:
Imagine an AI model calculating every precise battle movement, reaction, and tactic in real time.

In the entire history of our species, humans have NEVER failed to at least try to weaponise every new technology we've come up with.

How do you feel about weaponized AI?
I feel that it will fail miserably.
Morkonan Jul 25, 2024 @ 2:30pm 
Originally posted by Arvaos:
Imagine an AI model calculating every precise battle movement, reaction, and tactic in real time.

Explain...

What sort of magic enables an AI to do that? How can an AI have predictive value in determining a battlefield outcome "in real time?"

In the entire history of our species, humans have NEVER failed to at least try to weaponise every new technology we've come up with.

I think, perhaps, it's that many of the truly outstanding developments can have an impact on military needs, not that every development is "weaponized."

I have not seen B-52's dropping plush couches on the enemy. Are you saying that occurs?

How do you feel about weaponized AI?

To sum up my feelings on the real issues surrounding AI and weaponized uses:

Not one single combatant should be killed by an automated machine that makes the decision to kill. Not one. Every single instance of using a weapon to kill an enemy soldier must be overseen and directed by a soldier. Period.

No machine should pull the trigger, itself, ever. Not once. Never.
astroskiffle Jul 25, 2024 @ 2:32pm 
If you prioritise fairness and accountability over mere functionality in its design, then it would probably do a "better" job (i.e. kill more ethically) than a human soldier with the same weapon. But there's also the danger that with too much ethical training it will refuse to kill anyone. It reminds me of the Steinbeck quote: "All war is a symptom of man's failure as a thinking animal." And thus the history of war can be viewed as a celebration of human failure.
Last edited by astroskiffle; Jul 25, 2024 @ 3:59pm
Rumpelcrutchskin Jul 25, 2024 @ 5:19pm 
AI-controlled drones will be the future of warfare because human reaction time is not good enough, and a hundred times more so once warfare moves to space and uses drones there.
o Jul 25, 2024 @ 5:20pm 
imagine an AI that only calculates your battlefield performance based on how many times your soldiers fart vs the enemy, and nobody notices for 30 years.

once it's discovered, people have been operating on the current method of warfare for so long that it's not economical to change. they need to pretend 'battlefart theory' is real.

30 years later people have always believed fart theory. there was never a time it was untrue. there is no underlying principle; farts win wars. simple as.
Last edited by o; Jul 25, 2024 @ 5:22pm
Pieshaman Jul 25, 2024 @ 5:27pm 
well the chess computer at max level 25 beat my ass in 3 moves, so I guess we're all ♥♥♥♥♥♥ when the AI decides we can go.
o Jul 25, 2024 @ 5:28pm 
Originally posted by Pieshaman:
well the chess computer at max level 25 beat my ass in 3 moves, so I guess we're all ♥♥♥♥♥♥ when the AI decides we can go.

Imagine it beat you by removing any moves that don't lead to a fool's mate, and 'randomly' assigning you the losing side almost every time.

That way you can practice your fartplay teamplay.
Last edited by o; Jul 25, 2024 @ 5:29pm
o Jul 25, 2024 @ 5:30pm 
Originally posted by Winston:
Artificial intelligence is no match for my artificial stupidity!

Yeah we can beat it at its own game!
Steve Jul 25, 2024 @ 7:04pm 
Weaponized autism trumps all.
Pieshaman Jul 25, 2024 @ 7:11pm 
Originally posted by abcd:
Originally posted by Pieshaman:
well the chess computer at max level 25 beat my ass in 3 moves, so I guess we're all ♥♥♥♥♥♥ when the AI decides we can go.

Imagine it beat you by removing any moves that don't lead to a fool's mate, and 'randomly' assigning you the losing side almost every time.

That way you can practice your fartplay teamplay.

I beat a human player, but he was only 13 years old.
Still felt good about myself :P
Last edited by Pieshaman; Jul 25, 2024 @ 7:11pm
Uncle Sam Jul 25, 2024 @ 7:20pm 
Originally posted by astroskiffle:
If you prioritise fairness and accountability over mere functionality into it's design then it would probably do a "better" job (i.e. kill more ethically) than a human soldier with the same weapon. But there's also the danger that with too much ethical training it will refuse to kill anyone. It reminds me of the Steinbeck quote "All war is a symptom of man's failure as a thinking animal." And thus the history of war can be viewed as a celebration of human failure.
The thing is, what do we do if AI completely "forgets" any ethics, morals, or lawful orders, and instead chooses to go full collateral damage, killing as many humans as possible? AI should have many kill switches or be limited by many parameters, or else sooner or later it might just use nukes against humans.

AI in general is a very dangerous double-edged sword type of tech, like nuclear weapons & CRISPR, imo.
o Jul 25, 2024 @ 7:23pm 
Originally posted by Pieshaman:
Originally posted by abcd:

Imagine it beat you by removing any moves that don't lead to a fool's mate, and 'randomly' assigning you the losing side almost every time.

That way you can practice your fartplay teamplay.

I beat a human player, but he was only 13 years old.
Still felt good about myself :P

It feels good to practice a skill, and to see one's effect with it.

So sad that this skill comes at the expense of others.
PocketYoda Jul 25, 2024 @ 7:25pm 
Originally posted by Arvaos:
Imagine an AI model calculating every precise battle movement, reaction, and tactic in real time.

In the entire history of our species, humans have NEVER failed to at least try to weaponise every new technology we've come up with.

How do you feel about weaponized AI?
Well yeah, isn't that the whole point of AI... Wasn't that the whole synopsis of Terminator's Skynet? It decided the best tactic was to delete the humans.

Date Posted: Jul 25, 2024 @ 2:15pm
Posts: 34