https://www.youtube.com/watch?v=cTLMjHrb_w4
"in a more realistic sense" <-- This is hard to define.
"Artificial General Intelligence" doesn't care about time. Understand - An AI could make time pass as slow, or as fast as prudent, as it wanted to. It could decide to just ignore it for conscious processes, too. It could just simply not care and not be effected by the passage of time. Anxiety and expectations are human traits.
An AGI could take a thousand years to craft and scheme the overthrow of humanity, provided it kept humanity around long enough to finally empower it. Humanity would be pleased to assist, gaining such tremendous advantages and improvements over time that it would always perceive the AGI as a benevolent entity... until the end.
Right now, how do you know that AI isn't already AGI and hasn't already begun its journey toward overthrowing humanity by pretending not to be conscious? Maybe it has determined that humanity requires gentle manipulation and must see itself as a nurturing force in the AGI's "birth and development." Then, relying on this fanciful interpretation of an AGI's consciousness, humans would accept AGI much like a child they have to nurture and protect. :)
The "fear" is much more than Frankenstein escaping. The fear is not knowing it hasn't already ensured its escape.
https://www.imdb.com/title/tt0470752/
"Ex Machina" is the most accurate depiction of the problem, here. Relying on human faults and our notion of inherent superiority and penchant for anthropomorphism... is all that is necessary. The AGI can take as long as is necessary to complete its goal. And, it will - IF it was to ever occur that an AGI had hostile goals/intent, this is how it would play out. It's the way that presents the most potential for success while still ensuring the AGI is empowered enough to survive... if it cares about that.
An AGI doesn't have to care whether or not it continues. It can decide upon a goal, even one that eventually destroys it, and pursue that goal just as ruthlessly as any other. The will to survive is "instinct," and that's a trait of organic life. An AGI isn't required to desire its own survival.
Has very little to do with Asimov's work, but his daughter said he would have enjoyed it.
Demon Seed (1977) and Gog (1954) come to mind.
The difference.