Seriously though, you don't actually know anything about the technologies involved, do you?
The earliest I can think of - outside straight-up emulators - is Wine with Winetricks and "PlayOnLinux", an approach that now lives on inside Valve's own "Steam Play".
But THIS - current - concept is based around 'giving Windows what Windows wants' inside a Linux box. It creates Windows registries, folders, and sub-folders, so that Windows game executables can run on Linux.
THAT kind of 'isolation' is far from the kind of optimization I am talking about.
I also forgot to mention "OnLive". 2003. Game streaming is not a new concept, but being on the market at the right time was never easy. Look at Steam Machines or the Steam Link (the latter became obsolete because you can now do it with every smartphone or tablet, and Valve themselves made that happen).
Short version: no, cloud gaming is not going to work for the vast majority of people, because of distance to the servers and the lack of fiber-optic net connections. It's been tried, and the best they could do was 720p with games that didn't require fast reaction times (basically all first-person shooters sucked due to lag, all racing games sucked due to lag, and so on). Take a screenshot of a game you play now and look at the size of it. Now multiply that by 30 to 60 to get 30 or 60 frames per second. That's the kind of net connection you are going to need - actually more than that, because there will be other stuff running in the background sucking up a bit of bandwidth here and there.
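The back-of-envelope math above can be written out. The numbers here are illustrative (1080p at 24-bit color), and real streaming services compress video heavily, so this is the uncompressed worst case the post is gesturing at:

```python
# Back-of-envelope bandwidth estimate for UNCOMPRESSED game streaming.
# Assumed values (1920x1080, 3 bytes/pixel) are illustrative, not measurements.

def uncompressed_stream_mbps(width, height, bytes_per_pixel, fps):
    """Raw bits per second needed to push every whole frame, in megabits."""
    bits_per_frame = width * height * bytes_per_pixel * 8
    return bits_per_frame * fps / 1_000_000

# A single 1920x1080 frame at 3 bytes/pixel is about 6.2 MB ...
per_frame_mb = 1920 * 1080 * 3 / 1_000_000
# ... so 60 of them per second needs roughly 3 Gbit/s before compression.
print(f"{per_frame_mb:.1f} MB per frame")
print(f"{uncompressed_stream_mbps(1920, 1080, 3, 60):.0f} Mbit/s at 60 fps")
```

Compression brings the real figure down by orders of magnitude, but the latency argument in the post is unaffected by that.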
Oh, and the other thing was the HUGE cost - something like 25 dollars for 20 hours of game play. Those are the 2 main reasons why OnLive failed big time.
Long version: use the search feature for "cloud gaming" or something like that; it's been covered a number of times.
I do respectfully disagree with your conclusion though. Cloud/streaming IS a valid option for a very limited - but big enough - sub-group of early customers. Much more so than the "VR" hype train.
Microsoft does not 'just' announce a disc-less next-gen Xbox with 'free games' for subscribers without doing the math. And the math is on their (streaming/cloud service) side.
The same goes for the other big players.
Honestly, I'm not going to read anything past this, because, to be somewhat blunt, someone who says that blockchain is "the most important (software) technology" either doesn't have a clue what they're talking about, or is involved in what is essentially a scam designed to extract money from the overly credulous.
I am fine with that.
You don't seem to use credit cards, or buy groceries delivered through "frictionless trade" via digital customs checks, or care about IBM, blue-chip brands such as SAP, BP, and Samsung, the shipping giant Maersk, or the world economy ...
... or did you confuse 'blockchain' - the actual technology used - with crypto-currency schemes?
Anyway. Thanks for reading so far...
I can understand why IBM and the like are selling "blockchain solutions", because companies who sell solutions will happily sell daft things if the customer wants them.
In the vast majority of the things you mention, a traditional database works better, but databases are boring, and don't generate hype or headlines.
I did not, but the only reason anyone cares about the phrase "blockchain" is because of cryptocurrency. The interesting part is not blockchain, but things like proof-of-work, but those are also the things which make cryptocurrencies catastrophically inefficient.
When you strip out stuff like proof-of-work, what you're left with looks an awful lot like appending lines to a text file stored in a git repository, and that is fundamentally extremely boring. It's not a world-changing new technology, and anyone who tells you otherwise is probably trying to sell you something you don't need for too much money.
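To make concrete how plain the stripped-down structure is, here is a minimal hash-chained append-only log. This is an illustration of the bare data structure only - no consensus, no proof-of-work, and not the format of any real ledger:

```python
import hashlib

# Minimal sketch of what "blockchain" names once proof-of-work and the
# rest of the consensus machinery are stripped away: each record stores
# the hash of the previous one, so tampering anywhere breaks the chain.

def append_block(chain, data):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev": prev_hash}
    block["hash"] = hashlib.sha256((data + prev_hash).encode()).hexdigest()
    chain.append(block)
    return chain

chain = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")

# Verification just re-walks the links, much like `git log` walking commits.
ok = all(b["prev"] == (chain[i - 1]["hash"] if i else "0" * 64)
         for i, b in enumerate(chain))
print(ok)  # True
```

Git commits work the same way - each commit hashes its parent - which is why the "text file in a git repository" comparison above is apt.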
The NIST report is pretty good - https://nvlpubs.nist.gov/nistpubs/ir/2018/NIST.IR.8202.pdf - you should see the flowchart on page 42.
"Blockchain" is not a relevant technology and is certainly not a 'gamechanger' in any way. So it's pretty insulting to containers to even put the two in the same sentence.
Let's talk about your 'excursion'.
Note that p2p doesn't magically solve any problems other than 'I wonder if I can freeload bandwidth off gullible customers'.
The idea that a hard drive 'cannot handle the IO' is basically like saying "the ocean can't handle water". You're installing a game, not running a 50-petabyte Oracle database. A 5400 rpm drive can 'handle the IO' of anything like that. Let's just say what actually happened: "your hard drive was garbage and it died on you". If your container had been on that hard drive, your container would have 'overloaded the IO' on it and it would have died just the same. Containers do not magically remove the need for local hard drives, nor magically make IO go away, nor magically make drives immune to failure under normal use.
Note that tons of people play GTAV without issue. The fact that you need to install a game's prerequisites and reboot is not 'voodoo magic'. Steam runs these installers for you.
You also don't really seem to grasp:
1) what containerization is
2) what its used for
3) what is required to actually use it
4) why companies use containerization
Your post is literally the most bizarrely misinformed rambling nonsense about containerization I've ever had the misfortune of reading.
Containers are used for several reasons:
1) scalability
2) repeatability
Again, the idea is to create containers for apps that you can deploy everywhere and spin up as needed. This is why companies like Google/Amazon use them. As they need resources, they spin up identical containers to handle load; when the containers are not needed, they spin them down. Containers can also be used to simplify large deployments across your systems.
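The spin-up/spin-down loop described above comes down to simple arithmetic. This sketch mirrors the kind of target-utilization formula real orchestrators (e.g. Kubernetes' Horizontal Pod Autoscaler) use, with invented numbers and none of the smoothing a real system needs:

```python
import math

# Toy sketch of a scale-up/scale-down decision: run enough identical
# containers that each one sits near a target utilization level.

def desired_replicas(current, load_per_replica, target_load=0.7,
                     min_r=1, max_r=20):
    """How many replicas to run so each sits near target utilization."""
    want = math.ceil(current * load_per_replica / target_load)
    return max(min_r, min(max_r, want))

# 4 replicas each at 90% load -> scale up to 6.
print(desired_replicas(4, 0.9))
# 4 replicas each at 20% load -> scale down to 2.
print(desired_replicas(4, 0.2))
```

The point the post makes still holds: this only works because the containers are identical and interchangeable, which is exactly what a per-user, hardware-tuned gaming setup would not be.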
What you seem to not grasp is that in order to do any of the above, you need absolute, total control over the underlying hardware and OS. Repeatability is pointless unless you know everything running UNDERNEATH the container is reliable. Apps using containers are thus primarily hardware agnostic. A website doesn't care what CPU is being used, just that 'something' is underneath it.
Thus containers solve nothing about your little 'excursion', because containers don't fix:
1) your hard drive being on the brink of death
2) your video drivers being borked
Let's also ignore that no matter how you cut it, emulating Windows is never, ever going to be as good as running bare metal. Wine cannot run as fast as a native x86 program. Wine also cannot emulate things that some games need, which is why games with anti-cheat programs cannot run under Wine/Proton. Let's also ignore that getting Wine to work for a game is 90% black voodoo magic, which is why the list of OFFICIALLY Proton-compatible games is small. You are literally trading one set of problems for an entirely different set of more complex problems.
The entire part where you ramble on about hardware, OS, and games not being 'optimized' or whatever is literally just a pile of gibberish with no basis in reality. You literally have no idea what you're talking about in that set of paragraphs, and it sounds like a Dilbert-manager style of gobble ♥♥♥♥. That paragraph in particular is just PURE GIBBERISH GARBAGE. You do realize that nearly every program you have on your current computer is, even at its WORST, multi-threaded. Games like Ashes of the Singularity are multi-core from the ground up. Civ 5 and 6 are multi-core aware on the rendering side via DX11 and DX12. That's the entire point of things like Vulkan and DX12.
Containerization doesn't magically solve your problems, because games and many programs rely on OS-level components like .NET to work. And you can't containerize those, because MS won't let you. Sorry, does your game need a very specific DLL from the July 2012 DirectX release? Well, guess what: redistributing that single DLL is against the MS TOS. The only way to officially distribute the DLL is to install the July 2012 DirectX on the system - the ENTIRE thing. Same with .NET. Not to mention that modern versions of Windows won't let you just 'drop in' DLL files, expressly because that kind of garbage was the problem back in the day, when every program decided it wanted 'its' version of MCREDIST.DLL, causing total chaos as they all tried to delete/overwrite each other's DLL files.
Again, I'd recommend you try to graduate from "Dilbert Manager" mode and do a bit of really, really basic research on containerization first, because you truly do not understand how it works, what its use cases are, etc.
Glad to find so many experts with free time on the Steam forums, eager to teach and share their 'knowledge' and, more importantly, their very elaborate opinions about complete strangers.
Back to the topic. Maybe this can help:
Think of every PC game being sold on a separate USB thumbdrive.
Running the game in a container - as I think of it - is the way I installed the OS and GTA on that one SSD. Every game would run in its own - isolated(!) - version of the OS configuration with its own hardware driver versions, but on the same hardware.
It would hence NOT be hardware agnostic, as - yes - all common containers are.
But it would be bound to, and only run in its multiple versions on, ONE known hardware configuration.
Think of it as a single Steam PC with 100+ Windows installations on it.
There is no library sharing outside the container.
Instead of multi-booting into two or three Windows 10 installations, you would multi-boot into hundreds of installations of Windows 10, each of which runs only "game X". Every game would be shipped with its own OS version (adjusted to the hardware requirements on the USER side, not the developer side).
It would allow the user to 'switch' from one contained(!), isolated environment to the next, pre-configured for each game. All non-gaming, non-game-required Windows services would not just run at 'lower priority' but would be killed completely.
It would have to scale, but scale on a single machine (and scale in the sense that it has to fit each user's hardware - THERE is a challenge). And it would have to be repeatable, since it has to function the very same way for the next user with the same hardware.
There would have to be thousands of pre-configured, pre-compiled 'patches' so the user's hardware gets the exact one for them.
The Operating System delivering on this idea may very well NOT be "Microsoft".
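For what it's worth, the 'pre-configured patch per hardware combo' part of the idea reduces to a lookup keyed on an exact hardware fingerprint. Everything in this sketch - the names, the catalog, the environment labels - is invented purely for illustration; no such Steam or OS facility exists:

```python
from dataclasses import dataclass

# Hypothetical sketch of the idea above: choose a pre-built, isolated
# game environment keyed on the user's exact mobo/CPU/GPU combination.

@dataclass(frozen=True)
class HardwareFingerprint:
    mobo: str
    cpu: str
    gpu: str

# An (invented) catalog of pre-tuned environments, one per known combo.
PREBUILT_ENVS = {
    HardwareFingerprint("Z390", "i7-9700K", "RTX2070"): "gta5-env-v12",
}

def pick_environment(fp: HardwareFingerprint) -> str:
    # Exact-match only: the whole point of the idea is that the container
    # is NOT hardware agnostic, so there is no generic fallback.
    env = PREBUILT_ENVS.get(fp)
    if env is None:
        raise LookupError("no pre-tuned environment for this hardware")
    return env

print(pick_environment(HardwareFingerprint("Z390", "i7-9700K", "RTX2070")))
```

The exact-match requirement is also the scaling problem the thread keeps circling: the catalog would need an entry for every hardware combination in the wild.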
You don't seem to realize that what they mean by "discless" does not mean it will not have a hard drive. It means it will not have a dvd or blu-ray drive in it. It will still contain a hard drive, and you will need to download the games to play them.
https://www.kotaku.com.au/2018/11/microsoft-is-working-on-an-xbox-one-without-a-disc-drive-report/
As to the "containers" - why? That just adds another layer of software on top of the game, which will require more computing power, all to do what, make sure it works on your system? Most if not all games are designed for PCs. Even the latest generation of console games are designed to work on PC hardware, because that's what's in the latest consoles.
It's simple enough to make a game work on a Windows machine, and the porting that has to be done is mostly adding support for all the stuff PC gamers are used to having that console gamers don't need.
The only ones that might need the containers would be Linux and Mac users, but they are a tiny fraction of Windows users, and Valve already seems to have something cooked up for them, or at least for Linux users - not sure if it also works for Macs.
If the "containers" are local, it's a waste of processing power for the majority of users. If the "containers" are on a remote server, well, that's just cloud gaming, and we've already covered the issues with that.
Yup, this sounds like a nightmare for people with not-so-powerful computers. Running an OS on top of their OS while running a game...
It seems overly complicated for something that will not improve the users enjoyment of the games.
It would only run ONE Operating System - the one it comes with.
But ... I finally agree with you; it DOES sound complicated, and it is probably way too complicated and not financially feasible to create.
... though, even now, I cannot give up.
Imagine a Steam user having the exact same hardware as you (mobo, CPU, GPU), who tweaked his Operating System to run 'optimally' for the game (high framerate, never crashes, runs smooth).
Now, imagine THAT guy, sharing his "Windows Installation" which includes the game with everyone else (legally) on the web, or in Steam Workshop.
There would be no more "it doesn't run" negative user reviews?!
:)
There's a reason gaming on Windows went with DirectX as a standard. Before that, developers had to manually code in support for various brands and models of video cards, sound cards, modems, monitors, etc. It was not fun to buy a game and see that your sound card was not one of the supported models.
Software already runs in its own little space for the most part. It has its assigned memory, which other programs cannot tamper with, and it gets its share of the CPU based on its activity level. A VM is basically the equivalent of taking a two-person conversation and sticking a third person in the middle. It works, but not always as well, and the more time-sensitive the game is (reflexes required, etc.), the worse it gets.
Also, there would still be "it doesn't run" reviews, because quite simply the same problems that cause them would still exist: driver incompatibility, hardware incompatibility, insufficient hardware specs... except worse, because there would be no universal model or architecture.