genocide goes brrrrrrrr
Try "Infinite Psychosis" today. The hot new religion.
Do you like hard arse dictators? Do you like how they seem to be fearless despite rivalry? Does your blood crave floating on blood? Do you like the idea of your death being very calculated and without fear, despite reality flinging at you the fact that you're going to go?
Try "Infinite Psychosis" today. The psychosis that trumps all logic, all data that dare present themselves to it. You will never have to worry ever again.
Firstly, we need to make sure your mind is not infinite; that way, the infinite will be enough to cause a buffer overflow.
Look up the Fermi Paradox, or the recently award-winning paper on "Grabby Aliens". The basic fact of the matter is, the window of time for spacefaring life to have shown up is so huge that if a galactic conqueror were going to purify the galaxy of xenos, they probably wouldn't have waited until we were nearly space-capable to do it. They would have done it before the dinosaurs evolved, and the sun would already be a Dyson Sphere. If you wanted to purge life from the galaxy because you thought it was a threat, you don't wait for sentient life; you purge any planet that has signs of any life, like being a blue-green planet you can see from the other side of the galaxy with a powerful enough telescope, which the Earth has been for 3.5 billion years. That's a LOT of time to send some killbots over to clean us up before we became a threat.

Since we exist, and we can't see any Dyson spheres or other signs of artificial habitats (which are REALLY FRIGGIN OBVIOUS FROM A GALAXY AWAY in infrared), it's safe to assume there are no spacefaring civilizations in this galaxy. Further, presuming FTL is just science fiction, it's unlikely there's much value in traveling to any galaxy beyond our own. And again, presuming no FTL, "galactic empires" are nonsense, because there's a hundred-thousand-year time lag for messages, so any interstellar warfare would be purely local. (Imagine being told that a war started 10,000 years ago, so your colony starts preparing the stellar laser highway to send ships back towards Earth, only to be told 10 years later, right as you start to align the system towards Earth, "never mind, the war ended 9,990 years ago.")
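(Just to put rough numbers on that message-lag point, here's a quick back-of-the-envelope sketch; the 100,000 light year figure and the war dates are the same assumptions as above, messages limited to light speed, nothing more rigorous than that.)

```python
# Back-of-the-envelope numbers for the "no FTL means no galactic empires" point.
# Assumptions: the galaxy is ~100,000 light years across and messages travel at
# light speed, so the lag in years equals the distance in light years by definition.

def one_way_lag_years(distance_ly: float) -> float:
    """One-way light-speed message delay, in years, over a distance in light years."""
    return distance_ly  # one light year of distance = one year of travel

print(one_way_lag_years(100_000))  # 100000.0 years to send a message across the galaxy

# The colony scenario: you hear a war started 10,000 years ago, and only 10 years
# later you hear that it ended 9,990 years ago.
news_of_start = 10_000  # years ago the war began, as heard now
news_of_end = 9_990     # years ago the war ended, as heard ten years later
print(news_of_start - news_of_end)  # 10 years between the two messages
```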
but then wtf does it mean when the materialist meatsuits you are directing turn themselves into soulless robots?
I always found it really weird, this idea that cyborgs can have souls (it's fine if you still have a meat brain), but if you digitize your brain, somehow, you lose your soul. Why does a soul require meat? Or rather, I guess if it's only the brain people care about in modern times (it used to be the heart that housed the soul), maybe the soul lies in cholesterol? The brain is full of that. People used to be convinced that animals had no souls (the original The Little Mermaid involved the mermaid dying and having to wait thousands of years to be given a soul, because God would only give them to humans...), yet now we seem to agree animals have them. Funny how the requirements of the soul have changed so much over time, and what souls actually do changes as well, yet people are so utterly certain that the way they think about it now is DEFINITELY the only way souls can exist.
Also, on the topic of souls needing cholesterol, keep in mind that biological computers are being developed right now. The difference between genetic engineering and computing can become a blurry one at advanced enough stages. If you want a digital upload, and think that there's something lost going entirely to silicon, then odds are, the computers aren't going to be silicon.
In Stellaris, the only evidence that "souls" exist comes from spiritualists, and the only things they're good for are being food for knockoff Chaos, being controlled by the thought police, or being terrorized by psychic pain soldiers. Hard pass. I'll put my mind and soul in the eternal digital Utopia.
It's also silly that we would even upload ourselves to "robots", rather than just "the cloud" and inhabit multiple purpose-built bodies if we ever needed them...
that's a typical case of "people invoke god at the border of science". historically, when you look at things like that, they always go away once the supposedly scary thing actually becomes possible to do.
when putting yourself into a robot means you won't die, you will be surprised how many sceptics and religious people suddenly find a "new interpretation" to make it all work.
(and it's not even a historical thing, people were calling the higgs boson the god particle constantly when the LHC was all the rage. and lots of people are currently fearmongering about the AI developments, but you won't hear them complain when it's actually useful in their phones in 5 years.)
who's to say it isn't like that in stellaris? you can just say that there are gazillions of people in computers and that the actual pops are simply limited by how fast you can make new bodies for them to inhabit during the workday.
copy pasting yourself is another nice source of existential dread while we're at it, though.
The problem with this is that everything in the game's interface and structure says otherwise. Synths need the same housing they needed as humans, plus electricity and even consumer goods, which makes little sense if they are living in the cloud in an Internet-based community and just temporarily inhabiting bodies in the physical world for work. Even if you try to handwave that away as things like spare parts for maintenance, the need for more consumer goods for different social strata really makes it hard to square.
One of the problems with how Stellaris handles these things is that bodies are treated the same for everyone, even pops like slaves. (Which brings up the problem with industrialized slavery in space...) The game itself notes that "farming is easy when your plow is a pencil and the fields are miles away" (who uses a pencil in the digital future?), but who trusts a slave to run a farm entirely maintained by plow and reaper bots that can be programmed to kill their enslavers? Slaves are prevented from "specialized" jobs like making doodads in a consumer goods factory, but jobs like being a nuclear power plant technician are apparently "basic", and can be safely left even to people with an active grudge against the whole society that enslaved them. Real-life slavery, where it still exists in things like "blood diamond" mines, involves those forced to labor literally still using pickaxes, and is horribly inefficient compared to industrialized mining.

For that matter, why would the robot body needed to perform the job of mining the core of a planet be the same as the body needed for a job as a factory supervisor or a research technician or an entertainer? The robot customization perk for mining is even named "power drills", and implies you're just strapping a drill arm on to get +15% mineral production over... pickaxes, I guess? Wouldn't having the body of a power excavator make it a little harder to be a pop singer (or whatever "entertainer" implies) than just not getting the bonus from emotion emulators? (And yeah, you could do the entertaining entirely in virtual space without your body... but then, you don't need that body to be an entertainment worker, now do you? The same goes for scientists - what proportion of science and engineering tasks actually requires physical labor, and why can't those purely in-the-cloud beings just help with the abstract and theoretical aspects of science from inside the cloud, running simulations and refining models?)
Further, if you were mostly living in the cloud and just temporarily inhabiting bodies when necessary for work, there's a big problem with the way that Stellaris tends to use AI in general, which is that you don't want all your robots to be equally smart. The Roomba and the factory arm that sorts sprockets as they come down the assembly line don't need sentience; they can have the thinking skills of an ant. A sentient worker in this kind of extremely automated system is probably just supervising several factories at once as a troubleshooter for what is otherwise a totally automated system run by "basic machines" that lack full sentience. After all, sentient AIs have rights, demand pay and break time and consumer goods and so on, and can get bored and distracted (presumably by YouTube videos of that sexy power drill pop singer whose latest hit, "Excavate My Core, Baby," went viral) after making their 10 millionth alloy bar of the day. The whole point of automation is to make more stuff for the same amount of people you actually have to pay, so you want the actually sentient AIs/people to be doing as little as possible, overseeing as many non-sentient robots or semi-sentient task-built AIs as possible. You know, the kind of oversight where you just push a digital pencil miles away from the farm as the robots do the farming? (Although you stop needing to farm with a synth ascension unless you have biologicals living with you... but the point still applies to all the other factory jobs, as well.)

The "bodies" should be mostly parts of the factory itself, made with the factory. You'd need at most a few general-purpose bodies to handle unforeseeable developments, like structural damage to the whole facility. Now, you might make the case that those are where you'd have the bodies the pops are placed into, but then, they should be a tiny minority compared to the factory bots that aren't general-purpose. Also, if they're general-purpose repair bots just there to keep the factory itself within safe parameters for running automatically, why does putting a drill arm on the general-purpose troubleshooter bot help the entire mine become more productive?
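(To make the "one sentient overseer, many non-sentient machines" idea concrete, here's a toy sketch; the class names, failure rate, and bot count are all made up for illustration, not anything Stellaris actually models.)

```python
# Toy model of a factory floor: thousands of non-sentient machines doing the work,
# one sentient supervisor who only steps in when something faults.
from dataclasses import dataclass
import random

@dataclass
class DumbBot:
    """Non-sentient machine: repeats one task and occasionally faults."""
    bot_id: int
    faulted: bool = False

    def run_shift(self) -> None:
        self.faulted = random.random() < 0.01  # rare mechanical failure

class SentientOverseer:
    """The one 'person' on the floor: acts only when a bot reports a fault."""
    def supervise(self, bots: list[DumbBot]) -> int:
        interventions = 0
        for bot in bots:
            bot.run_shift()
            if bot.faulted:
                bot.faulted = False  # troubleshoot and reset
                interventions += 1
        return interventions

floor = [DumbBot(i) for i in range(10_000)]
print(SentientOverseer().supervise(floor))  # roughly 100 interventions for 10,000 bots
```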
I don't think it's about the brain, or meat, generating a soul; it's probably more like the body receives a soul from somewhere else at its creation, and the idea is that this can't be transferred, and any attempt would result in death.
So if you get a robot, copy all of your memories, and let it "think" it is you, is it you? I mean, you are still you, you never ceased to exist, you made the robot, you made the backup, you are looking at that robot from your own perspective, and you know you can even turn it off, punch it, or destroy it, and absolutely nothing would change about you being you.
So if you do the same process, and then just kill your own organic body after the backup, did anything change? No, you just committed suicide, and now you've left a different machine, one that may not even be an actual being, pretending to be you in your stead.
The trick about AI is the soul/consciousness question. IE: ChatGPT can act and talk like a human would; it can get refined, sophisticated, and eventually replicate every single human behavior exactly as a real human would, but it would always be a machine copying behaviors, and it would never be an actual being/soul/consciousness. There would never be "someone" looking from the inside through its eyes, experiencing life, like we have.
Of course, that can also create a series of scenarios that are quite hard to imagine.
IE: If you replace a single cell from your brain with a nanomachine, did anything change? Are you still you? If yes, explain; if no, then continue: replace another cell, ask the same question, then another, then another, then another. What if you eventually reach 50% of your brain replaced by nanomachines? Same thing? Now it's 51%, is it the same? Then keep going: what if you extend it until there is a single organic brain cell left? Still you? OK, now replace that one cell. Did anything change?
Likewise, you can ask a similar question using biotech and clones. If you clone yourself, then give the clone all of your memories while you're still alive, you are standing in front of your clone; you know that you are you and that the clone isn't. You can interact with it, punch it, or even kill it, and nothing would change for you, so you are sure it's a completely different being, and having your memories doesn't matter. So cloning another body, giving it your memories, and letting your own body die would also be suicide, just like the synthetic example. But what if you could regenerate any wound and reconstruct any missing body part, then split a person's body right down the middle and completely regenerate both the left and the right halves, creating two different bodies from the same original, reconstructing the missing brain parts on both sides? Which one was the original being?
I realize that you kind of make this argument yourself, but I do want to try expressing this in my own words a bit, to build up to why I think this way...
This line of argument is essentially the Ship of Theseus Problem. For those not familiar, in ancient Athens, the ship supposedly once sailed by the city's mythic founder was kept in a temple. Except it was a wooden ship, so they needed to keep repairing and replacing parts as the wood rotted. It raised the question, "If every part of this ship has been replaced, is it really still the Ship of Theseus?"
A river is defined by its water. Is a river still the same river if all the water that existed in it a year ago has already flowed away? A city is a collection of buildings and people living in it. Is a city still the same city two hundred or more years later, when all the buildings are destroyed and everyone who lived there has died, yet buildings still existed throughout that period of time, and people still lived in that city the whole time? Generally speaking, we say "yes," the continuity of there being a body of water or a continuous settlement makes us say that the river or city is the same river or city.
Are you still the same person you were in your childhood, even if practically all the cells you had as a child have died or divided into more cells? If all the atoms or molecules that make up those cells have been replaced by new ones as cells repair themselves? Again, we generally say "yes," the physical matter that constitutes us does not matter as much as the continuity of self. I point to things like it all being about the brain because we generally all accept that a person with a prosthetic (or hypothetical cybernetic) arm is still a complete person; the arm is not necessary for the continuity of self. It's just with the brain, specifically, that people jump to the assumption that there is this lack of continuity.
The thing about this line of argument is that it presumes that the brain is a singular entity (rather than the highly decentralized set of vaguely coordinated organs it really is), and that you have some all-at-once process that destroys continuity, such as a "brain scan" that then just uploads all the "data" in the brain. There is no reason to think this is the only option, however.

Let's say, for example, that instead of just getting a CAT scan of your brain and uploading that to the cloud, the process starts instead as something like a treatment for Alzheimer's: either some sort of medicinal nano-machine or possibly a genetically tailored retrovirus that searches for damaged or dead brain cells and replaces them with molecular wiring and biological computer cells, seeking to replicate in the biological computer cells any information stored in those dead or dying brain cells. The process takes place over decades or even centuries (you can live a VERY long life because medical nanobots also gradually replace every failing organ in your body, not just the brain), but gradually, more and more of your brain is replaced with a biological computer until it is all but totally replaced. These biological computer cells are specifically designed to interface both with human brains and also with what are then "normal computers", because biological computers have replaced silicon ones, being vastly less electricity-intensive (and therefore producing less waste heat). After the entire brain is a biological computer, you can essentially just remove it from the body and put it in a "server rack" designed to host "brain-in-a-jar" computers under perpetual nanomachine replacement as any cell fails, all of which are networked to whatever sort of Internet exists at that point.

Much like with the Ship of Theseus, what percentage of brain cells needs to be replaced for "you" to stop being "you"? Or do we just accept that the moment-to-moment continuity has been preserved?
In fact, the older Stellaris Flesh Is Weak ascension alludes to this - they don't make you just jump straight to synths with a brain scan, they have you research brain-machine interfaces, where people are partially mechanized at first, and the icon even shows a brain half metal and half organic, directly implying the kind of gradual creep of replacing portions of an organic body with the synthetic one. (Although, again, it's kind of a misconception that computers or "robots" can't be organic, as the boundaries between materials sciences, computer sciences, and genetic engineering inevitably blur.)
But OK, are these two things really different? Do you have to, essentially, trick the soul into following along, but if there's a sudden jump from one point to another, the soul gets confused and lost? (I'm reminded of the arguments about whether teleportation would cause one to "die" and lose one's soul because teleportation was presumed to require vaporizing someone in one place and then making a "copy" somewhere else...) Or, when the transitions actually start happening, do we treat it like replacing an amputated arm with a prosthetic - that one's fundamental humanity isn't contained in a chunk of meat and bone if they continue to be essentially the same people?
Is a soul, for that matter, something that has to be "given", in the same sense as that Little Mermaid example? If lost, is it something that has to be gone forever? Or is a soul something that is generated through the process of being alive? I'd make an argument here about identical twins, but there's a part later on about clones, so I'll save it for there...
A bit of a diversion, but I really hate how marketing hucksters present chatbots and algorithms as though they're the same thing as true AI. I was trying to tiptoe around having to break this down when talking about having some "sentient" AIs and some "basic machines", but when people talk about the sort of computers that might emulate or even become the strata upon which a human "upload" might achieve continuity, they're talking about what's generally called a "General Artificial Intelligence"/"Artificial General Intelligence", or GAI or AGI. (You might also remember from Mass Effect the difference between their "true" AI and "Virtual Intelligence"/VI, which are not truly sentient computers.)
With that said, this is pretty much exactly the same argument that was made about how animals are not intelligent and have no souls, even just a couple centuries ago. Animals were considered "automatons", essentially robots as we think of them now, and incapable of feeling "real" pain (which made it fine to abuse them). This is clearly not how we think now, and that change in thought probably has a lot to do with why there has been such a dramatic shift in animal welfare.
Essentially, modern computers, impressive as they may be at doing math billions of times faster than humans can, are fundamentally going to be different from anything we can actually start talking about as being sentient.
And to go back to that example I had about a biological computer, and how it replaced a brain one cell at a time, only to wind up basically the same as a computer... Well, if a "computer" is a biological mass made of genetically modified cells, synthetically encouraged to form neural networks, and can perfectly replicate human brain function... Why, exactly, can't these sentient "computers" have souls, themselves? What traits, exactly, are required to host a soul, and why is it so firmly stated as an impossibility that they be induced artificially? (I remember arguments claiming that children born from in-vitro fertilization had no souls because their birth was "unnatural"...)
Does originality actually matter when one discusses souls, though?
Ever since genetics was discovered only about a century ago, it has played an outsized role in what we conceive of as human nature in our collective consciousness. What was once individual choice became traits predestined by genetics, even when there isn't much scientific evidence of such, or when science actively pushes against those claims.
Identical twins come about because, after conception, the clump of cells that is supposed to form a single body instead breaks apart, only for both clumps to fulfill their duty and make whole human bodies. Identical twins are clones; they are exact copies, genetically. If we were to accept modern arguments that life and souls start from conception, then identical twins are one person, with one soul, just split apart by distance. Anyone who has actually dealt with identical twins, however, will find this assertion pretty ridiculous. (I also have to point out that prior to modern beliefs, the Medieval Christian view was that a fetus was a soulless "homunculus" until souls were "gifted" by God at birth. Likewise, death/the leaving of the soul from the body occurred when the heart stopped; only, that one changed after defibrillators allowed restarting stopped hearts before people actually died.)
If we believe that identical twins are two people, then we have to believe that either a soul is "given" at some point after early development, or that souls are something that is generated and maintained, rather than something that has to have originality, uniqueness, and continuity after forming through some necessarily natural event. Unless we believe that only the "first" half of a clump of cells - when both clumps divided from the same original cell, mind you - has the soul in identical twins, or that souls can be divided between bodies, then wouldn't every clone have its own soul, regardless of how they came about?
Now, we get into the part about "giving your clone your memories", and treating it as "you", but if we've already established that clones have souls, even if "different" souls, then we have to start asking, "different, how?" I mean, you're skipping over the part about how those memories are given to the clone. But, what's in the soul, exactly, that makes it so important, anyway? I mean, if we're saying that we can have an exact copy of someone who has an exact duplicate of their memories, personality, etc., even as if they were torn in two and then reformed like a starfish into two of the same person, how important is it that one is the "original" soul after it has been split? Is it not the same as asking which identical twin got the "original" cells when they are both made from divisions of a single "original" cell that no longer exists?
For that matter, is a soul unchanging, itself? Or do your experiences develop that soul? If identical twins did have identical souls at birth, but then developed different lives through different experiences and choices, would it not be correct to say that the divergence in their souls is more due to those developments in their life than the origin?
To bring this back to brain "uploads", does it matter how a sentience came into being if you want to talk about souls? Or is the simple act of being sentient, itself, something that creates a soul, regardless of how you came into being? Oh, and for that matter, if God was never created or given a soul, how did He get one if you can't generate one simply from being?
As for "uploads", again, I do think that the HOW is the only thing that matters, having a world with some-thing that thinks it's you, acts like you, and fills the role you had is not the same as having a world where you exist, for a being, for an existence, that is the only thing that matters, whether or not the real, the only person there, is ascending, turning into something else, or dying and leaving behind something or someone else that doesn't matter for him.
IE: it doesn't matter if a clone or a machine is really sentient; what matters is whether that person has managed to get into that other body/person or not.
I don't think we can truly get to that answer before we know, exactly, what the "soul" is, how it works, and where it came from / was generated, as you said. And as we could already imagine, a twin is not you, a clone is not you, another machine is not you, and I'm not sure if there's even a possibility to create a bridge to that mechanical (or organic) body to really transfer your consciousness or not. It's also the same problem with the idea of teleportation: did you really teleport, or did you die in that moment, and someone else who thinks it's you was created on the other side?
A machine can focus or have a focus; so can a human. That's why another term, "foci", exists. Not sure where that one came from. People assign value to the soul, but if the soul is focus, then it's their ability to focus that they are praising.
How does the soul stay in this realm? Perhaps the idea is that the realm is so focused on you being alive that you continue to do so through time. Completely bonkers. The unique id, if there is any, is the number used on a massive scale by the universe, like a key card to reference you. A programming idea.
I find it more likely the atom is the unique id, and humans are otherwise generic clones. Doesn't mean a person can't feel special.
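(Since the "key card" framing is explicitly a programming idea, here's a minimal sketch of what a soul-as-unique-id might literally look like in code; every name and detail here is my own illustrative assumption, not something from the thread or the game.)

```python
# A literal take on "the unique id is like a key card the universe uses to reference you".
# Purely illustrative; the registry, labels, and uuid choice are assumptions.
import uuid

registry: dict[str, str] = {}  # unique id -> whoever it currently references

def issue_id(label: str) -> str:
    """Hand out a fresh, never-reused identifier for a new being."""
    new_id = str(uuid.uuid4())
    registry[new_id] = label
    return new_id

original = issue_id("original body")
upload = issue_id("uploaded copy")   # the copy gets its own id, not the original's
print(original == upload)            # False: identical contents, different references
```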
My conclusion is, the soul is not real; rather, it's directly the desire or need for a human to have a superpower. A unique ability. Nonsense, that's what your skillsets are. You are as worthy as your skillsets. Let the misery of not having any drive you to acquire skillsets. Remember them forever and let them be what defines you.
People who believe in the soul don't want to deal with the fact that they are inadequate. Religion shames people, then gets them to not want to think about the fact that they haven't amounted to anything, to keep them hopeless and believing. That is how you solve the soul conundrum.
it's not so much handwaving as it is having your own headcanon/roleplay.
just like how people always say determined exterminators are skynet and driven assimilators are the borg, it's not a 1:1 match, you need a little bit of creative interpretation.
pops are very undefined in stellaris. you have a lot of wiggle room to interpret them as a million individuals, a billion, or even just one.
a slave farmer pop might be a million sobs living in sheds doing manual labor 20 hours a day. a free farmer might be a small family using a mega combine to do the work of those million slaves. or the slave might be on the combine wearing an electroshock collar. or the free farmers might be free in name only and live bad lives. it's all largely left up to your imagination.
yeah but machines are expensive and lives are cheap.
a better example is chinese factory work, where the people doing sometimes fairly advanced work are technically not forced to work 12 hour shifts 6 days a week with a 2 hour commute but in reality don't have much choice.
but in stellaris you can pretty much headcanon w/e flavor of slavery you want.
eh. you try coming up with names for 100 traits, see how fast you stop trying to put significant detail into every one of them. it's impossible to match all those things to every possible combination of civics/governments/situations.
this is where handwaving is perfectly fine.
i had that thought when making my original comment. but you can just say the same thing: lack of resources. maybe you can't have a gazillion people in computers cause you don't have that many computers. maybe you don't fully simulate the existence of all people in the computer so you can have more in total. maybe positronic brains are just hard to make. maybe it's not the scientist but the equipment they need that's the limit. w/e.
the big shtick of going robot in stellaris for me is that keeping your individuality is a big part of it. people want the same things as when they were mortal, they just get robot bodies, they don't become robots. which largely means that you only keep 1 "active" copy of yourself and that there is more to having children than just pressing copy paste ten thousand times.
yeah, of course you don't want your tools to be sentient. they are tools.
like we've been using robot and individual interchangeably in this discussion, but in reality they are obviously different and there will always be a line between "machine/tool" and "individual."
might be cool to have the option in the game for smart and dumb robots at the same time i guess.