I suppose living forever would be depressing for some.
Johnny also has no long-term goal for his plans. Sure, he wants Arasaka toppled, but what then? He has no follow-up, no idea of how people should get through life without the corps. He believes in anarchy and then mocks the way anarchy looks. I don't think even HE actually knows what he wants besides revenge against Arasaka.
Edit: He's also pretty fond of using his friends for his personal gain without caring much about their feelings in the matter, just like the corpos use people for their gain.
This is kind of the problem with losing sight of one's originally intended goal when revenge and other negative feelings sidetrack it. It could be inferred that when he was planning the Arasaka raid to destroy the building and evacuate everyone from it, he also intended to do the same to Militech, Biotechnica, and the other corps, to "get rid of the madness you wreak," as his originally intended goal, since he hates corps/corpos and believes they are just sucking the land and the people dry of everything they can for power and profit.
Sadly, this is the way with most men who have designs for changing the world to fit their ideals or reasoning, whether for weal or for woe. They already know that many will denounce their efforts, or even actively work against them, but that still does not stop people from pressing on with whatever methods they believe are necessary to continue down their chosen path to their goal.
Ok, I did not know about him serving in the US military. Awesome backstory there. Yeah, handling explosives safely might also be information he is aware of.
I thought Militech was the one who fought that corporate war in Mexico.
A couple of bios I found, but both said he was in the military and deserted after fighting in a corrupt war.
https://cyberpunk2077.wiki.fextralife.com/Johnny+Silverhand
https://cyberpunk.fandom.com/wiki/Johnny_Silverhand#Biography
Again, nice backstory find there. I believe he could make the bomb less destructive with his military training; in other words, the bomb may have appeared to be a nuke on the outside, but he may have dumbed it down a bit by removing parts of it inside, or even reducing its explosive yield to lessen the explosion.
Star Trek: The Next Generation had an episode about Dr. Soong dealing with this same thing. But meanwhile, in real life:
https://www.inverse.com/science/when-will-i-be-able-to-upload-my-brain-to-a-computer
Re: the transporter problem/brain copy problem, I used to feel certain that "just copying a mind doesn't preserve consciousness, it merely copies it." But then the thought occurred to me a few years ago that if it wasn't simply a copy, but a continuous, uninterrupted neuroplastic migration, then that question becomes less clear-cut, imo.
For instance, a hypothetical scenario: instead of simulating a copied brain, we neurally network a living brain to a machine intelligence (presumably a much more advanced neural net than anything that exists today).
The subject experiences this augmented intelligence in real time, while still alive, as though it's merely an extension of their own mind. They gain new cognitive abilities, speed, tools, etc. but the internal experience of this is a conscious one that they can direct and interface with continuously. Where one begins and ends becomes impossible to distinguish. Where does their consciousness reside? In the brain? In the artificially intelligent augmentation? Both?
Then, gradually over time, the digital side of the equation is expanded to house more and more of the individual's conscious experience, while the biological brain is allowed to wind down. Eventually the biological brain dies, but from the continuous perspective of the person's interior experience, their consciousness is never interrupted, and persists unabated in digital form.
In this scenario, the brain isn't merely copied, but is networked while alive, and the mind "expands and migrates into" the digital system. It then ultimately resides solely in the digital.
It's a Ship of Theseus problem, of course. But it raises the question, imo: does consciousness ever get interrupted in this case? After all, we can't even localize consciousness within a human brain today. We have plenty of theories, but we also know that everything from memory to executive function to specific cognitive tasks can neuroplastically shift between different parts of the brain rather than being merely "stored" in one location.
My thinking (currently at least - it's always changing) is, if it's a one-way street where the living brain is merely receiving inputs from the digital system but no conscious functions are running on the digital hardware, then consciousness is limited to the brain.
However, if we're talking about a complex artificial neural system that can mimic neuroplasticity in a way analogous to a biological brain, and if we end up observing actual functions that happen solely on the machine side but are experienced by the networked conscious observer, then consciousness isn't limited to the brain, and isn't interrupted or lost at biological brain death. Especially if there's a gradual process of migration.
Whether this is even possible of course remains to be seen. But it's the one scenario I can think of, hypothetically, where I'm not confident in saying a "digital backup" of a conscious mind is just a copy and not the same person living on after the death of their former body.
When you start to play with the "soul" and nothing bumps in the night, and no divine intervention interferes with the actions of men essentially playing god, it raises the question: is there truly a soul, is there truly an afterlife, are there truly higher power(s)?
In the real world, when/if such engram technology is invented, the social climate around it will obviously be hostile at the very least, because it will make all of humanity question tens of thousands of years of spirituality, unless science or the technology somehow proves or reveals there is a soul, an afterlife, and/or god(s). Without that proof, however, technology would become a true savior in all practical terms. Why wonder about something that seems increasingly imaginary when in practice we have practical immortality?
As an aside, religions based around reincarnation may have an easier time than ones based around an afterlife for many reasons.
If this is all there is to life (reality and life as it is before we die), then what difference does it make to the universe whether your consciousness is an original, a fake, or a copy? Regardless of technicalities, process, or nuance, in the real world they're all the same and have the exact same impact as a living individual. Unless the perspective of a person somehow disappears; but how do we know other "real" people's consciousnesses even exist to begin with, or how would we know whether our consciousness is or isn't the same, unless we were in some heaven looking down on our fake selves?
If there is no afterlife or soul, then an original Saburo and a copy Saburo will both act the exact same way, impact the world the exact same way, and feel the exact same way about themselves and their consciousness. A Saburo is a Saburo is a Saburo... unless there's something else to it that can be shown. If it looks like a duck, quacks like a duck, and is a duck, it's a duck.
In game, the possible empty/broken-personality side effects of Soulkiller and V's tarot card experiences are the counter to the above. Are those tarot anomalies just random code and the brain trying to understand the engram process, like conscious dreaming? Or is the entire engram process, with its injection of code and extra processing power inside V's head, awakening their mind/soul to something more as their actual self dies or is overwritten by the engram? Messages of cosmic understanding from the afterlife to someone who's already half dead, if you will.
Hmm, I can't remember, so I guess I should ask: does Johnny ever see the tarot cards on walls, or is it just V? If it's just V, that makes the tarot cards more interesting from that perspective.
Anyways, this thread is about Johnny and whether he's a redeemable character or someone to hate. He's really both, and there's still the question of whether engram Johnny truly reflects Johnny as a person, or whether it's a broken ghost, as some evidence suggests. In the Temperance ending, engram Johnny becomes his best self, and based on all the philosophical mumbo jumbo I just typed, and assuming an afterlife/soul doesn't really exist, that could very well be the best ending for both V and Johnny if things don't work out down the line in other endings.
lol, it is funny because it happens a lot whenever people talk about Johnny and V on these forums. I guess because we're talking about a dead ghost guy and a walking, talking tech zombie. :p
Human is a really good one. It goes into stuff like this on a psychological level.