Supposedly this change results in longer-lived leaders "on average".
It doesn't say "ruler", but since the ruler is part of the gestalt council, that should apply to it as well.
I also hadn't even thought about certain machine paragons you can find that are over 1,000 years old. I assume they would be exempt from these changes as well, but that only makes it seem more unfair.
Paragons should be immortal through destiny traits, which are also outside the normal scope.
They start at age 5-10 with a 100-year lifespan, giving them 90-95 years of active service without any boosts.
Compare that to a normal organic leader, starting at age 27-50 with an 80-year lifespan, giving them 30-53 years of active service.
Or a Lithoid, starting at age 50-80 with a 130-year lifespan, giving them 50-80 years of active service.
Given that they are "now affected by lifespan-increasing technologies and modifiers", I think it will be rare to lose one to old age.
So where are the extremely complicated machines in real life that last forever? There aren't any? Oh.
It also still doesn't make much sense from a practical point of view: even when their hardware fails, it's easy enough to create backups or just transfer the software to a new body, especially for a gestalt consciousness that relies on said maintenance to survive. There's really no excuse to be losing machine leaders outside of freak accidents or battles.
Do you know how a computer "transfers" files? There is no actual transfer involved; it makes a copy of the file at the new location and deletes the original (well, if we're going to be technical, it just unlists the file and frees up the space on the hard drive, so the data can still be recovered at that point if you have the right tools).
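To make that concrete, here is a minimal sketch of what a file "move" typically does on a common desktop filesystem (the function name and paths are purely illustrative, not anything the game actually does): within one filesystem it only relinks the directory entry, and across filesystems it literally copies the bytes and then deletes the original.

```python
import os
import shutil

def move_file(src: str, dst: str) -> None:
    """Illustrative 'move': rename if possible, otherwise copy-then-delete."""
    try:
        # Same filesystem: the OS just relinks the directory entry;
        # no data is copied at all.
        os.rename(src, dst)
    except OSError:
        # Different filesystem: there is no real "transfer". The bytes are
        # duplicated at the destination, then the original entry is removed.
        # Deleting does not wipe the data; the blocks are only marked as
        # free, which is why recovery tools can often still read them
        # until they get overwritten.
        shutil.copy2(src, dst)
        os.unlink(src)
```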
Now consider that in the context of sentient beings. Would you be okay with your mind being copied to a new body while the original you is killed?
Then there's the whole "the copy process isn't perfect; each copy of a copy introduces new flaws, eventually leading to corruption" angle. That isn't much of an issue in most general computer use, but imagine it when what you're copying is a mind.
Admittedly we're talking about science fiction and a whole new form of life here, so these might not even be problems; I'm just trying to point out that "just transferring their mind to a new device" may not be as simple as it sounds.
P.S. The horror game SOMA plays around a bit with the mind-copying concept, if anyone has played it.
What do you mean by "extremely complicated machines"? Robots are not alive in the same sense as organics. AI is pure thought, a ghost in the machine. And the outer shell for the AI is nothing more than a tool that can be replaced. Just like people change clothes.
Read the post above yours as to why that idea might have a few issues.
I also don't think a robot would have concerns regarding its personality, especially as part of a gestalt consciousness. If anything, the leader is the only one who might have concerns, but apparently they wouldn't have to worry about that at all.
SOMA is pretty cool though.
Well, I admittedly focused more on the individual robot's perspective.
How exactly a hive mind would function is a bit too alien. But the leader drones of a machine gestalt have always been described as semi-autonomous drones, meaning they act beyond just receiving orders from the hive mind, so one could speculate that they would face the same issues as individual robots when it comes to copying their minds.
Even if I admit that it's more far-fetched in their case.