Hell, if I stretch your logic to its absolute extreme, would it even be moral to close the game if all their lives run on your local machine? You'd snuff out hundreds or thousands of lives any time you need to go to sleep.
Killing a sentient A.I. in a video game would almost certainly fall into the same category. The on-screen persona would "die", but the computer isn't going to scrub and erase that persona from its code. It will simply respawn and carry out its role.
Also, it's created to die, much like a sheep, pig, or cow. Those are sentient too, yet the vast majority of people will scarf down a Whopper or similar with no philosophical issues.
Now, it's possible that we'll see an AI serving the role of 'Dungeon Master' in the future.
However, designers have been using very rudimentary AI since the Atari era, so that is simply going to continue to grow. Behavior algorithms, procedural generation, etc. aren't going anywhere. ChatGPT is really just super-advanced procedural generation that's been informed by tons of external content. That kind of AI, yes, will develop further.
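To put a concrete picture on what "rudimentary procedural generation" means here, a minimal made-up sketch of the table-and-dice variety (not taken from any actual game; the names are invented). The poster's comparison is that an LLM swaps the hand-written tables for a statistical model trained on huge amounts of text, but still assembles output from learned pieces:

[code]
import random

# Toy, hand-rolled "procedural generation": fixed tables plus a seeded RNG.
def generate_encounter(seed: int) -> str:
    rng = random.Random(seed)
    enemy = rng.choice(["bandit", "drone", "rogue netrunner"])
    place = rng.choice(["back alley", "megabuilding lobby", "highway overpass"])
    twist = rng.choice(["is wounded", "calls for backup", "offers a bribe"])
    return f"A {enemy} in a {place} who {twist}."

# Same seed, same encounter -- the "intelligence" is just tables and dice.
print(generate_encounter(42))
[/code]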
Yes. I think it would be freaking awesome to have AI that helps create a more "living world", especially if it's cost-effective software that lets devs kind of fire and forget, and focus more on game design, content, and art than on programming.
Also, assuming they just get reset (or can be reset) like a Star Trek holodeck and are programmed to be oblivious to the "fourth wall", meh, no more f'ed up than roleplaying the same thing with the bots we have now, in my opinion.
Yeah sure. The A in A.I. stands for Artificial; it ain't real. So kill and torture to your heart's content.
That said, you don't really need sentient A.I. to model that at all. Even the most basic open-world game already has NPCs that run the moment you pull a trigger within a certain proximity. It's in context that they get wacky, like when cops in NC will shoot at you for engaging in a firefight with gangers in their presence (roughly the sort of check sketched at the end of this post).
A more pertinent place to apply it would be in the activities, conversations, and reactions of the secondary NPCs who hang around your character but aren't driving the main plot. That way you get a sense of a life outside of wake up, get chromed, do drugs, commit murder and mayhem, rinse, repeat.
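For context on how simple that "run when a trigger is pulled nearby" logic actually is, here's a hypothetical sketch (all names and numbers are invented, not taken from any real engine). It's a distance check and a state flip, nothing remotely sentient:

[code]
from dataclasses import dataclass
import math

# Hypothetical "civilian flees when shots are fired nearby" behaviour.
FLEE_RADIUS = 30.0  # trigger distance in arbitrary game units (made up)

@dataclass
class NPC:
    x: float
    y: float
    state: str = "idle"  # "idle" or "fleeing"

def on_gunshot(npc: NPC, shot_x: float, shot_y: float) -> None:
    """Flip the NPC into a fleeing state if the shot lands within range."""
    if math.hypot(npc.x - shot_x, npc.y - shot_y) <= FLEE_RADIUS:
        npc.state = "fleeing"

bystander = NPC(x=10.0, y=5.0)
on_gunshot(bystander, shot_x=12.0, shot_y=8.0)  # close enough to hear
print(bystander.state)  # -> fleeing
[/code]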
If they wrong me, I want them dead. If they don't wrong me, I couldn't care less what happens to them.
As long as they leave out the more annoying aspects of human behaviour like arbitrary trolling, abandoning any regard for grammar, going into a frothing rage because someone criticised their favourite thing, assuming anything said by anyone is a personal insult and using it as an excuse to throw a tantrum, etc...
On the topic - I don't think the question can be answered, because it doesn't really make sense. To elaborate:
-- By the time we're at AGI, solving moral dilemmas in video games will be the least of our concerns. Our survival IRL will be our top priority in the case of any alignment issues (which are 99.999+% certain to happen, because we can't solve the alignment problem even among ourselves, i.e. humans).
-- Even if AGI is miraculously "harmless" and indeed helps humans advance and serves the "greater good" (a naive hope, because different humans have different ideas of what's "good"), we will have to deal with a post-scarcity economy, and nobody knows what that will look like. What will humans do when AI can do every job, far cheaper and better than those humans? How do we realign the whole of society and, well, life, in a world where humans can no longer be economically productive?
-- If AGI is "harmless" and it is put to the "greater good", then it's only a matter of (a very short) time until that AGI is used to advance AI itself, meaning super-intelligence will happen, followed by some sort of tech singularity that will outsmart everything we can imagine, and everything we can't. It will be like ants trying to comprehend current human intelligence: impossible. We have no clue what will happen to us and the world then, but the chances of us playing video games (or even existing, at least in our current state) are pretty slim.
-- Even if we somehow mitigate all of this, i.e. we don't get super-intelligence, we don't run into post-scarcity issues, and we end up with a "roughly human-level AGI", the dilemma will start way before it's integrated into video games. Questions like "if it is sentient, what rights does it have?", "can we even call it 'it' if it's sentient?", and "is 'it' alive?" will be very tough to approach. No developer in their right mind will put a sentient AI into a setting where it will antagonize humans. That's bad for the whole "alignment problem", you know.
TL;DR: No, we won't be shooting emulated living beings in video games. Aggression, suffering, and pain are bad no matter the setting; AI would be no different.
That was a good read dude, thanx